CN115542941A - Control method and system for guiding unmanned aerial vehicle to land accurately - Google Patents

Control method and system for guiding unmanned aerial vehicle to land accurately

Info

Publication number
CN115542941A
CN115542941A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
image
landing
landing platform
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211233346.7A
Other languages
Chinese (zh)
Inventor
王泰花
吕建红
乔耀华
杨杰
蔡俊鹏
吴见
张韶元
周长明
孙磊
詹晓宇
苑雨薇
于晓艳
张鹤赢
董庆
张龙龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid Intelligent Technology Co Ltd
Original Assignee
State Grid Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Grid Intelligent Technology Co Ltd
Priority to CN202211233346.7A
Publication of CN115542941A
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a control method and a control system for guiding an unmanned aerial vehicle to land accurately. The control method comprises the following steps: acquiring images of the landing platform at set intervals during the landing process of the unmanned aerial vehicle; detecting the brightness and contrast of the image and judging whether a set threshold is met, and if so, directly carrying out image recognition, otherwise carrying out image equalization processing with an image correction algorithm; acquiring the position of the landing platform visual marker relative to the unmanned aerial vehicle in the camera coordinate system, and further determining the position information of the unmanned aerial vehicle in the landing platform visual marker coordinate system; and, based on the error between the position information and the expected landing point position and taking external disturbance into account, adjusting the position of the unmanned aerial vehicle until the error falls within a set range, and then controlling the unmanned aerial vehicle to land. According to the invention, a light supplement device is arranged on the landing platform, which improves the positioning accuracy of the landing platform visual marker when the unmanned aerial vehicle lands.

Description

Control method and system for guiding unmanned aerial vehicle to land accurately
Technical Field
The invention relates to the technical field of unmanned aerial vehicle inspection, in particular to a control method and a control system for guiding an unmanned aerial vehicle to land accurately.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
In recent years, with the continuous growth of power grid scale and the rapidly increasing complexity of transmission lines, unmanned aerial vehicles have gradually become important tools for line inspection. By deploying networked airports within the unmanned aerial vehicle inspection operating range, autonomous line inspection by unmanned aerial vehicles can be realized. However, as the inspection workload of the power grid grows day by day, all-weather intelligent autonomous inspection has become a new requirement of power line inspection, and accurately guiding the unmanned aerial vehicle to land at the airport at night is crucial to this process.
Generally, after the unmanned aerial vehicle has completed an inspection task, it receives a return instruction from the airport and starts the return flight; when it arrives above the airport and reaches the preset landing height, the precise landing procedure begins. However, when the unmanned aerial vehicle lands at the airport at night or under weak light, the onboard camera cannot recognize the visual marker of the airport landing platform, and the unmanned aerial vehicle has to land relying only on RTK positioning. During landing, the RTK information acquisition frequency is low and the position and attitude of the unmanned aerial vehicle are adjusted slowly; in addition, disturbances from environmental factors such as wind easily cause the unmanned aerial vehicle to deviate from the preset landing point, so the landing accuracy is relatively low.
Disclosure of Invention
In order to solve the problems, the invention provides a control method and a control system for guiding an unmanned aerial vehicle to accurately land.
In some embodiments, the following technical scheme is adopted:
a control method for guiding an unmanned aerial vehicle to accurately land comprises the following steps:
acquiring images of a landing platform at set intervals in the landing process of the unmanned aerial vehicle;
detecting the brightness and contrast of the image, judging whether a set threshold value is met, and if so, directly carrying out image recognition; otherwise, carrying out image equalization processing by using an image correction algorithm;
acquiring the position of the landing platform visual mark relative to the unmanned aerial vehicle under the camera coordinate system, and further determining the position information of the unmanned aerial vehicle in the landing platform visual mark coordinate system;
based on the error between the position information and the expected landing point position, and taking external disturbance into account, the position of the unmanned aerial vehicle is adjusted until the error falls within the set range, and the unmanned aerial vehicle is controlled to land.
As a further scheme, the landing platform is provided with a light supplementing device for improving the brightness of the visual mark of the landing platform;
before the unmanned aerial vehicle returns to the air above the airport landing platform and starts automatic landing, detecting the brightness of the acquired visual sign image, and if the image brightness is lower than a set first threshold value, starting a light supplementing device;
after the light supplementing equipment is started, detecting the brightness and the contrast of the visual sign image, and when the brightness of the visual sign image is lower than a set second threshold value, adjusting the brightness of the light supplementing equipment by adjusting the PWM duty ratio of a control signal so that the image brightness is not lower than the set second threshold value;
when the brightness signal adjustment value of the light supplement equipment is greater than the adjustable upper limit value or less than the adjustable lower limit value, the brightness of the light supplement equipment is not adjusted any more; if the brightness of the image does not meet the set threshold value, the image correction algorithm performs image equalization processing.
As a further scheme, before the image of the landing platform is collected, calibrating the camera of the unmanned aerial vehicle; the specific process is as follows:
determining the number of the angular points of the calibration board and the actual size of each checkerboard;
shooting calibration plate pictures in different directions and angles by using an unmanned aerial vehicle camera to obtain a group of images;
detecting the characteristic points in the calibration board image, acquiring the pixel coordinates of the corner points of the calibration board, and calculating the physical coordinate values of the corner points of the calibration board according to the actual size of the checkerboard and the coordinates of a world coordinate system;
calculating and acquiring an internal reference matrix I and a distortion parameter matrix D of the camera according to the obtained relationship between the angular point physical coordinate values and the angular point pixel coordinate values; and optimizing the internal parameter matrix I and the distortion parameter matrix D.
As a further scheme, an image correction algorithm is used for image equalization, and the specific process is as follows:
converting the collected RGB image into a gray image, and calculating the pixel gray level of the gray image;
normalizing the image gray level, and calculating the cumulative distribution function of a gray histogram;
and performing image inverse transformation based on the cumulative distribution function to obtain an image after image equalization.
As a further scheme, the position of the landing platform visual mark relative to the unmanned aerial vehicle under the camera coordinate system is obtained, and then the position information of the unmanned aerial vehicle in the landing platform visual mark coordinate system is determined, and the specific process is as follows:
acquiring external parameters of a camera based on an internal reference matrix calibrated by an unmanned aerial vehicle camera to obtain a rotation matrix R and a translation matrix T;
acquiring coordinates of the visual mark of the landing platform under a camera coordinate system, and establishing a coordinate conversion relation between the unmanned aerial vehicle under the camera coordinate system and the visual mark coordinate system of the landing platform by using the rotation matrix R and the translation matrix T;
and calculating to obtain the coordinates of the unmanned aerial vehicle under the coordinate system of the visual mark of the landing platform based on the coordinates of the unmanned aerial vehicle under the coordinate system of the camera.
As a further scheme, based on the error between the position information and the expected landing point position, the position of the unmanned aerial vehicle is adjusted by considering external disturbance, and the specific process includes:
determining the landing point position of the unmanned aerial vehicle in the visual landing platform, and acquiring the coordinates of the landing point position in a landing platform visual marker coordinate system;
calculating a landing error based on the coordinates of the unmanned aerial vehicle in the landing platform visual marker coordinate system and the coordinates of the landing point position in the landing platform visual marker coordinate system;
and determining the control quantity of the landing error through PID control, adding a control compensation item in the control quantity, and adjusting the position of the unmanned aerial vehicle by using the obtained control quantity.
As a further scheme, the control compensation item is determined according to the position error calculated by the pose calculation at the current time and the position error calculated by the pose calculation at the previous time.
In other embodiments, the following technical solutions are adopted:
a control system for guiding a drone to land accurately, comprising:
the image acquisition module is used for acquiring images of the landing platform at set intervals in the landing process of the unmanned aerial vehicle;
the image preprocessing module is used for detecting the brightness and the contrast of the image, judging whether a set threshold value is met, and if the set threshold value is met, directly identifying the image; otherwise, carrying out image equalization processing by using an image correction algorithm;
the position conversion module is used for acquiring the position of the visual mark of the landing platform relative to the unmanned aerial vehicle under the camera coordinate system, and further determining the position information of the unmanned aerial vehicle in the visual mark coordinate system of the landing platform;
and the error control module is used for adjusting the position of the unmanned aerial vehicle, taking external disturbance into account, based on the error between the position information and the expected landing point position, until the error falls within a set range, and controlling the unmanned aerial vehicle to land.
In other embodiments, the following technical solutions are adopted:
a terminal device comprising a processor and a memory, the processor being arranged to implement instructions; the memory is used for storing a plurality of instructions, and the instructions are suitable for being loaded by the processor and executing the control method for guiding the unmanned aerial vehicle to accurately land.
In other embodiments, the following technical solutions are adopted:
a computer-readable storage medium, wherein a plurality of instructions are stored, and the instructions are suitable for being loaded by a processor of a terminal device and executing the control method for guiding the unmanned aerial vehicle to accurately land.
Compared with the prior art, the invention has the beneficial effects that:
(1) The invention innovatively provides a control method for guiding an unmanned aerial vehicle to land accurately. A light supplement device is arranged on the landing platform, so that at night or under weak light the position of the visual marker of the airport landing platform can be clearly shown by illumination. This improves the positioning accuracy of the landing platform visual marker when the unmanned aerial vehicle lands, solves the problem that at night the position of the visual marker cannot be obtained and only RTK positioning can be relied on, and improves the accuracy of the unmanned aerial vehicle's night landing.
(2) The invention innovatively provides an image equalization processing method: image preprocessing is added after the visual marker image is obtained, and the brightness or contrast of the image is adjusted, which improves the accuracy of image recognition and solves the problem that dark, low-contrast images obtained at night affect image recognition.
Additional features and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
Fig. 1 is a flowchart of a control method for guiding an unmanned aerial vehicle to accurately land in an embodiment of the present invention;
fig. 2 is a schematic view illustrating a disassembled structure of the light supplement device for the landing platform according to the embodiment of the present invention;
fig. 3 is a top view of the light supplement device of the landing platform in the embodiment of the invention.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
Example one
In one or more embodiments, a control method for guiding an unmanned aerial vehicle to land accurately is disclosed, which specifically includes the following processes with reference to fig. 1:
(1) Acquiring images of a landing platform at set intervals in the landing process of the unmanned aerial vehicle;
In this embodiment, with reference to fig. 2 and fig. 3, a light supplement device is arranged on the landing platform. The light supplement device is a fill light, and a light source with uniform backlighting is selected; a fill-light scheme that illuminates the bottom of the landing platform visual marker is adopted. According to the size of the unmanned aerial vehicle landing platform, the fill-light source is installed and fixed at the bottom of the visual marker on the landing platform, and the light source wiring is routed through holes drilled in the landing platform and connected to the controller, so that the light source can be conveniently switched on and off.
When the unmanned aerial vehicle returns above the airport and starts to land automatically, the precise landing procedure is triggered and the brightness of the visual marker image is detected. If the image brightness is lower than a set first threshold (the fill-light turn-on threshold), the fill light of the airport landing platform is switched on. After the fill light is switched on, the brightness and contrast of the image are detected again; when the image brightness is lower than a set second threshold (set according to the recognition requirements of the actual project), the fill-light brightness is adjusted first: the airport issues a fill-light brightness adjustment signal, the PWM duty cycle of the fill light is adjusted to change the image brightness, and the brightness is then detected again. When the fill-light brightness adjustment value falls below the adjustable lower limit or rises above the adjustable upper limit, the fill-light brightness is no longer adjusted. If the image brightness still does not satisfy the set threshold (the threshold for image equalization), the image correction algorithm performs image equalization. The lower and upper limits of the fill-light brightness adjustment value are determined by the model of the selected fill light.
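As an illustration of the above fill-light control logic, the following is a minimal Python sketch. The threshold values, the mean-gray brightness measure, and the interfaces capture_frame and set_fill_light_duty are hypothetical placeholders introduced only for this sketch; the embodiment does not specify them.

```python
import cv2
import numpy as np

# Assumed thresholds and PWM limits; actual values depend on the project and the fill-light model.
FIRST_THRESHOLD = 40          # fill-light turn-on threshold (mean gray value)
SECOND_THRESHOLD = 80         # brightness required for reliable marker recognition
DUTY_MIN, DUTY_MAX = 10, 100  # adjustable range of the fill-light PWM duty cycle (percent)

def image_brightness(bgr_image):
    """Estimate image brightness as the mean gray value (an assumed metric)."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    return float(np.mean(gray))

def set_fill_light_duty(duty_percent):
    """Hypothetical hook that issues the fill-light PWM duty cycle via the airport controller."""
    print(f"fill-light PWM duty set to {duty_percent}%")

def prepare_marker_image(capture_frame, step=10):
    """Switch on and step up the fill light until the image is bright enough or the duty limit is reached."""
    frame = capture_frame()
    if image_brightness(frame) >= FIRST_THRESHOLD:
        return frame                                  # ambient light is sufficient
    duty = DUTY_MIN
    set_fill_light_duty(duty)                         # switch the fill light on
    frame = capture_frame()
    while image_brightness(frame) < SECOND_THRESHOLD and duty + step <= DUTY_MAX:
        duty += step                                  # raise brightness via the PWM duty cycle
        set_fill_light_duty(duty)
        frame = capture_frame()
    return frame  # if still below the threshold, image equalization is applied downstream
```

In use, prepare_marker_image would be called once when the precise landing procedure is triggered, and the returned frame passed to the image equalization and recognition steps described below.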
The image acquisition interval of the unmanned aerial vehicle camera is designed according to the computation time of the landing control flow. During the landing process, the unmanned aerial vehicle continuously acquires images of the landing platform visual marker and continuously adjusts its position, so that the unmanned aerial vehicle is guided to land.
(2) Detecting the brightness and contrast of the image, judging whether a set threshold is met, and if so, directly carrying out image recognition; otherwise, carrying out image equalization processing by using an image correction algorithm;
In this embodiment, after the image of the landing platform visual marker captured by the unmanned aerial vehicle is obtained, the brightness and contrast of the image are detected; when the set threshold is met, recognition and detection can be carried out directly, and when the set threshold is not met, the acquired image is equalized. The specific processing procedure is as follows (an illustrative sketch follows the steps):
1) Converting the collected RGB image into a gray-scale image;
2) Calculating the pixel gray level m of the gray image;
3) Normalizing the image gray levels, where 0 represents black and 1 represents white, and using the probability density p_m(m) to represent the distribution of image gray levels:
p_m(m_k) = n_k / n,  k = 0, 1, ..., l − 1    (1)
4) Calculating the cumulative distribution function of the gray histogram:
s_k = T(m_k) = Σ_{j=0}^{k} p_m(m_j) = Σ_{j=0}^{k} n_j / n    (2)
where k denotes the k-th gray level of the image, l denotes the number of image gray levels, m_k denotes the current gray level, n_k denotes the number of pixels with gray level m_k, s_k is the mapped value of the cumulative distribution function for the current gray level, n is the total number of pixels in the image, and n_j is the number of pixels at gray level j.
5) Performing the inverse image transformation to obtain the equalized image m_k:
m_k = T^{-1}(s_k)    (3)
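The steps 1)–5) can be illustrated with a short Python/NumPy sketch. It uses the conventional CDF-mapping form of histogram equalization with 256 gray levels; the function name and the 8-bit assumption are choices made only for this sketch.

```python
import cv2
import numpy as np

def equalize_marker_image(bgr_image, levels=256):
    """Histogram equalization of the visual-marker image following steps 1)-5)."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)      # 1)-2) convert to gray image
    n = gray.size                                            # total number of pixels in the image
    hist = np.bincount(gray.ravel(), minlength=levels)       # n_k: pixel count of each gray level
    p = hist / n                                             # 3) p_m(m_k) = n_k / n
    s = np.cumsum(p)                                         # 4) cumulative distribution s_k
    mapping = np.round(s * (levels - 1)).astype(np.uint8)    # 5) map gray levels back to [0, levels-1]
    return mapping[gray]
```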
(3) Acquiring the position of a visual mark of the landing platform relative to the unmanned aerial vehicle under a camera coordinate system, and further determining the position information of the unmanned aerial vehicle in the visual mark coordinate system of the landing platform;
In this embodiment, before the unmanned aerial vehicle camera acquires images, the camera is first calibrated using the Zhang Zhengyou calibration method to obtain the camera parameters. The specific calibration steps are as follows (an illustrative sketch follows the steps):
1) Preparing a camera calibration board, and determining the number of angular points of the calibration board and the actual size of each checkerboard;
2) Shooting calibration plate pictures in different directions and angles by using an unmanned aerial vehicle camera to obtain a group of images;
3) Detecting the characteristic points in the calibration board image, acquiring the pixel coordinates of the corner points of the calibration board, and calculating the physical coordinate values of the corner points of the calibration board according to the actual size of the checkerboard and the coordinates of a world coordinate system;
4) Calculating and acquiring an internal reference matrix I and a distortion parameter matrix D of the camera according to the obtained relationship between the angular point physical coordinate values and the angular point pixel coordinate values;
5) Optimizing the camera intrinsic parameter matrix I and the distortion parameter matrix D by using OpenCV.
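For reference, a minimal OpenCV calibration sketch in Python corresponding to steps 1)–5) is shown below. The corner count, square size, and image directory are assumed values for illustration only.

```python
import glob
import cv2
import numpy as np

CORNERS = (9, 6)       # inner corner count of the calibration board (assumed)
SQUARE_SIZE = 0.025    # actual size of one checkerboard square in meters (assumed)

# Physical coordinates of the board corners in the world coordinate system (Z = 0 on the board plane)
objp = np.zeros((CORNERS[0] * CORNERS[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:CORNERS[0], 0:CORNERS[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
for path in glob.glob("calib_images/*.jpg"):      # board photos taken at different directions and angles
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, CORNERS)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
        obj_points.append(objp)
        img_points.append(corners)

# I is the intrinsic matrix and D the distortion parameters; calibrateCamera also refines them
ret, I, D, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("re-projection error:", ret)
```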
After calibration is completed, the image preprocessed in step (2) is acquired and the landing platform visual marker in the image is recognized; the specific process is as follows:
1) Acquiring external parameters of a camera based on an internal reference matrix calibrated by an unmanned aerial vehicle camera to obtain a rotation matrix R and a translation matrix T;
2) Obtaining coordinates of the visual mark of the landing platform under a camera coordinate system, and establishing a coordinate conversion relation of the unmanned aerial vehicle under the camera coordinate system and the visual mark coordinate system of the landing platform by using a rotation matrix R and a translation matrix T:
[X_c, Y_c, Z_c]^T = R · [X_w, Y_w, Z_w]^T + T    (4)
where (X_c, Y_c, Z_c)^T are the position coordinates of the unmanned aerial vehicle in the camera coordinate system and (X_w, Y_w, Z_w)^T are the position coordinates of the unmanned aerial vehicle in the visual marker coordinate system.
3) Based on the coordinates of the unmanned aerial vehicle under the camera coordinate system, the coordinates of the unmanned aerial vehicle under the landing platform visual mark coordinate system are calculated, and the method specifically comprises the following steps:
[X_w, Y_w, Z_w]^T = R^{-1} · ([X_c, Y_c, Z_c]^T − T)    (5)
The coordinates of the unmanned aerial vehicle in the camera coordinate system are obtained from the translation matrix automatically returned after AprilTag recognition. The AprilTag visual markers used are recognized with the AprilTag open-source library. After successful recognition, the library returns the corresponding tag ID, the coordinates of the visual marker corner points, the rotation matrix, and the translation matrix; the coordinates of the unmanned aerial vehicle in the camera coordinate system can then be obtained from the returned translation matrix.
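A short Python sketch of this step follows, assuming the pupil_apriltags bindings of the AprilTag open-source library; the intrinsic parameters and tag size are placeholder values. The camera (and hence unmanned aerial vehicle) position in the marker frame is obtained from the returned rotation and translation as in equation (5).

```python
import numpy as np
from pupil_apriltags import Detector

# Assumed camera intrinsics from the calibration above (fx, fy, cx, cy) and tag size in meters
CAMERA_PARAMS = (1000.0, 1000.0, 640.0, 360.0)
TAG_SIZE = 0.16

detector = Detector(families="tag36h11")

def uav_position_in_marker_frame(gray_image):
    """Detect the landing-platform tag and return the UAV position in the marker coordinate system."""
    tags = detector.detect(gray_image, estimate_tag_pose=True,
                           camera_params=CAMERA_PARAMS, tag_size=TAG_SIZE)
    if not tags:
        return None                              # marker not visible in this frame
    tag = tags[0]
    R, t = tag.pose_R, tag.pose_t                # marker pose in the camera frame: P_c = R*P_w + t
    return (-R.T @ t).ravel()                    # camera origin in the marker frame, cf. equation (5)
```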
(4) Based on the error between the coordinates of the unmanned aerial vehicle in the landing platform visual marker coordinate system and the expected landing point position, and taking external disturbance into account, the position of the unmanned aerial vehicle is adjusted until the error falls within the set range, and the unmanned aerial vehicle is controlled to land.
In this embodiment, the landing point of the unmanned aerial vehicle on the landing platform is first determined, and its coordinates in the landing platform visual marker coordinate system are obtained as (X_d, Y_d, Z_d)^T. Then, based on the coordinates of the unmanned aerial vehicle in the visual marker coordinate system, the error e_{3×1} between the real-time position of the unmanned aerial vehicle and the expected landing point position is obtained:
e_{3×1} = [X_w − X_d, Y_w − Y_d, Z_w − Z_d]^T
A PID controller is designed, and its control output is:
res = P·e + I·∫e dt + D·(de/dt)
where P, I, and D are the proportional, integral, and derivative coefficients, respectively.
Meanwhile, during the landing process the unmanned aerial vehicle is easily affected by external disturbances, so its position may not reach the expected effect; therefore, a compensation term Δ is added to the control quantity of the unmanned aerial vehicle, and the compensation term is calculated as follows:
Let the position error obtained from the pose calculation at the current time be e_k and the position error obtained from the pose calculation at the previous time be e_{k−1}; then:
Δ = k · (e_k − e_{k−1})
where the coefficient k is greater than 0, and its value is selected according to the actual situation.
Calculating the control quantity, and adjusting the position of the unmanned aerial vehicle: control = res + Δ.
Finally, it is judged whether the unmanned aerial vehicle is allowed to land, that is, whether the error tolerance is satisfied; if it is satisfied, the unmanned aerial vehicle is allowed to land, and if not, the unmanned aerial vehicle must continue adjusting its position, so as to ensure that it lands accurately on the airport landing platform.
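A minimal Python sketch of one step of this error-control loop is given below. The gains, the compensation coefficient, the tolerance, and the class and method names are assumptions introduced for the sketch; the reconstructed compensation term Δ = k·(e_k − e_{k−1}) is likewise an assumption based on the description above.

```python
import numpy as np

P, I, D = 0.8, 0.05, 0.2   # assumed PID coefficients
K_COMP = 0.3               # assumed compensation coefficient k (> 0)
TOLERANCE = 0.05           # assumed allowed landing error in meters

class LandingController:
    def __init__(self, expected_point):
        self.expected = np.asarray(expected_point, dtype=float)  # landing point in the marker frame
        self.integral = np.zeros(3)
        self.prev_error = np.zeros(3)

    def step(self, uav_position, dt):
        """One control update: PID term plus the disturbance-compensation term delta."""
        error = np.asarray(uav_position, dtype=float) - self.expected   # e = position - expected point
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        res = P * error + I * self.integral + D * derivative            # PID control quantity
        delta = K_COMP * (error - self.prev_error)                      # compensation term (assumed form)
        control = res + delta                                           # control = res + delta
        allow_landing = bool(np.linalg.norm(error) <= TOLERANCE)        # landing tolerance check
        self.prev_error = error
        return control, allow_landing
```

The step method would be called at each image acquisition interval with the position obtained from marker recognition, the returned control quantity applied as a position adjustment, and landing commanded only when allow_landing is true.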
With this method, by adding the light supplement device to the landing platform, the unmanned aerial vehicle can accurately acquire the position of the visual marker on the landing platform; and by adjusting the brightness or contrast of the image through the image equalization method, the accuracy of image recognition is improved and the problem that dark, low-contrast images acquired at night affect image recognition is solved.
Example two
In one or more embodiments, a control system for guiding a drone to land accurately is disclosed, comprising:
the image acquisition module is used for acquiring images of the landing platform at set intervals in the landing process of the unmanned aerial vehicle;
the image preprocessing module is used for detecting the brightness and the contrast of the image, judging whether a set threshold value is met, and if the set threshold value is met, directly identifying the image; otherwise, carrying out image equalization processing by using an image correction algorithm;
the position conversion module is used for acquiring the position of the visual mark of the landing platform relative to the unmanned aerial vehicle under the camera coordinate system, and further determining the position information of the unmanned aerial vehicle in the visual mark coordinate system of the landing platform;
and the error control module is used for adjusting the position of the unmanned aerial vehicle, taking external disturbance into account, based on the error between the position information and the expected landing point position, until the error falls within a set range, and controlling the unmanned aerial vehicle to land.
It should be noted that, the specific implementation of each module described above has been described in the first embodiment, and is not described in detail here.
Example three
In one or more embodiments, a terminal device is disclosed, which includes a server, where the server includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the control method for guiding the precise landing of the drone in the first embodiment when executing the program. For brevity, further description is omitted herein.
It should be understood that in this embodiment, the processor may be a central processing unit (CPU), or it may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
The memory may include both read-only memory and random access memory, and may provide instructions and data to the processor, and a portion of the memory may also include non-volatile random access memory. For example, the memory may also store device type information.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software.
Example four
In one or more embodiments, a computer-readable storage medium is disclosed, in which a plurality of instructions are stored, and the instructions are adapted to be loaded by a processor of a terminal device and execute the control method for guiding a drone to accurately land described in the first embodiment.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, they are not intended to limit the scope of the present invention; those skilled in the art should understand that various modifications and variations can be made, without inventive effort, on the basis of the technical solution of the present invention.

Claims (10)

1. A control method for guiding an unmanned aerial vehicle to land accurately, characterized by comprising:
acquiring images of a landing platform at set intervals in the landing process of the unmanned aerial vehicle;
detecting the brightness and contrast of the image, judging whether a set threshold value is met, and if so, directly carrying out image recognition; otherwise, carrying out image equalization processing by using an image correction algorithm;
acquiring the position of the landing platform visual mark relative to the unmanned aerial vehicle under the camera coordinate system, and further determining the position information of the unmanned aerial vehicle in the landing platform visual mark coordinate system;
based on the error between the position information and the expected landing point position, and taking external disturbance into account, the position of the unmanned aerial vehicle is adjusted until the error falls within the set range, and the unmanned aerial vehicle is controlled to land.
2. The control method for guiding the unmanned aerial vehicle to land accurately according to claim 1, wherein the landing platform is provided with a light supplement device for improving the brightness of the visual mark of the landing platform;
before the unmanned aerial vehicle returns to the air above the airport landing platform and starts automatic landing, detecting the brightness of the acquired visual sign image, and if the image brightness is lower than a set first threshold value, starting a light supplementing device;
after the light supplementing equipment is started, detecting the brightness and the contrast of the visual sign image, and when the brightness of the visual sign image is lower than a set second threshold value, adjusting the brightness of the light supplementing equipment by adjusting the PWM duty ratio of a control signal so that the image brightness is not lower than the set second threshold value;
when the brightness signal adjustment value of the light supplement equipment is greater than the adjustable upper limit value or less than the adjustable lower limit value, the brightness of the light supplement equipment is not adjusted any more; if the brightness of the image does not meet the set threshold value, the image correction algorithm performs image equalization processing.
3. The control method for guiding the unmanned aerial vehicle to land accurately according to claim 1, wherein before the image of the landing platform is collected, calibration of the unmanned aerial vehicle camera is carried out; the specific process is as follows:
determining the number of angular points of the calibration board and the actual size of each checkerboard;
shooting calibration board pictures in different directions and angles by using an unmanned aerial vehicle camera to obtain a group of images;
detecting the characteristic points in the calibration board image, acquiring the pixel coordinates of the corner points of the calibration board, and calculating the physical coordinate values of the corner points of the calibration board according to the actual size of the checkerboard and the coordinates of a world coordinate system;
calculating and acquiring an internal reference matrix I and a distortion parameter matrix D of the camera according to the obtained relationship between the corner point physical coordinate values and the corner point pixel coordinate values; and optimizing the internal reference matrix I and the distortion parameter matrix D.
4. The control method for guiding the unmanned aerial vehicle to land accurately according to claim 1, wherein an image equalization process is performed by using an image correction algorithm, and the specific process is as follows:
converting the collected RGB image into a gray image, and calculating the pixel gray level of the gray image;
normalizing the image gray level, and calculating the cumulative distribution function of a gray histogram;
and performing image inverse transformation based on the cumulative distribution function to obtain an image after image equalization.
5. The control method for guiding the unmanned aerial vehicle to land accurately according to claim 1, wherein the position of the landing platform visual mark relative to the unmanned aerial vehicle under the camera coordinate system is obtained, and then the position information of the unmanned aerial vehicle in the landing platform visual mark coordinate system is determined, and the specific process is as follows:
acquiring external parameters of a camera based on an internal reference matrix calibrated by an unmanned aerial vehicle camera to obtain a rotation matrix R and a translation matrix T;
acquiring coordinates of a visual mark of the landing platform under a camera coordinate system, and establishing a coordinate conversion relation of the unmanned aerial vehicle under the camera coordinate system and the visual mark coordinate system of the landing platform by using the rotation matrix R and the translation matrix T;
and calculating to obtain the coordinates of the unmanned aerial vehicle under the coordinate system of the visual mark of the landing platform based on the coordinates of the unmanned aerial vehicle under the coordinate system of the camera.
6. The control method for guiding the unmanned aerial vehicle to land accurately according to claim 1, wherein the position of the unmanned aerial vehicle is adjusted by considering external disturbance based on the error between the position information and the expected landing point position, and the specific process includes:
determining the landing point position of the unmanned aerial vehicle in the visual landing platform, and acquiring the coordinates of the landing point position in a landing platform visual marker coordinate system;
calculating a landing error based on the coordinates of the unmanned aerial vehicle in the landing platform visual marker coordinate system and the coordinates of the landing point position in the landing platform visual marker coordinate system;
and determining the control quantity of the landing error through PID control, adding a control compensation item in the control quantity, and adjusting the position of the unmanned aerial vehicle by using the obtained control quantity.
7. The control method for guiding the unmanned aerial vehicle to land accurately according to claim 6, wherein the control compensation term is determined according to the position error calculated by the pose calculation at the current time and the position error calculated by the pose calculation at the previous time.
8. A control system for guiding an unmanned aerial vehicle to land accurately, characterized by comprising:
the image acquisition module is used for acquiring images of the landing platform at set intervals in the landing process of the unmanned aerial vehicle;
the image preprocessing module is used for detecting the brightness and the contrast of the image, judging whether a set threshold value is met, and if the set threshold value is met, directly identifying the image; otherwise, carrying out image equalization processing by using an image correction algorithm;
the position conversion module is used for acquiring the position of the visual mark of the landing platform relative to the unmanned aerial vehicle under the camera coordinate system, and further determining the position information of the unmanned aerial vehicle in the visual mark coordinate system of the landing platform;
and the error control module is used for adjusting the position of the unmanned aerial vehicle, taking external disturbance into account, based on the error between the position information and the expected landing point position, until the error falls within a set range, and controlling the unmanned aerial vehicle to land.
9. A terminal device comprising a processor and a memory, the processor being arranged to implement instructions; the memory is used for storing a plurality of instructions, wherein the instructions are suitable for being loaded by the processor and executing the control method for guiding the unmanned aerial vehicle to accurately land according to any one of claims 1-7.
10. A computer-readable storage medium having stored thereon a plurality of instructions, wherein the instructions are adapted to be loaded by a processor of a terminal device and to execute the control method for guiding a precise landing of a drone of any one of claims 1 to 7.
CN202211233346.7A 2022-10-10 2022-10-10 Control method and system for guiding unmanned aerial vehicle to land accurately Pending CN115542941A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211233346.7A CN115542941A (en) 2022-10-10 2022-10-10 Control method and system for guiding unmanned aerial vehicle to land accurately

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211233346.7A CN115542941A (en) 2022-10-10 2022-10-10 Control method and system for guiding unmanned aerial vehicle to land accurately

Publications (1)

Publication Number Publication Date
CN115542941A true CN115542941A (en) 2022-12-30

Family

ID=84732884

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211233346.7A Pending CN115542941A (en) 2022-10-10 2022-10-10 Control method and system for guiding unmanned aerial vehicle to land accurately

Country Status (1)

Country Link
CN (1) CN115542941A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117409263A (en) * 2023-12-15 2024-01-16 成都时代星光科技有限公司 Unmanned aerial vehicle automatic image correction guiding landing method and computer storage medium
CN117409263B (en) * 2023-12-15 2024-04-05 成都时代星光科技有限公司 Unmanned aerial vehicle automatic image correction guiding landing method and computer storage medium

Similar Documents

Publication Publication Date Title
US11314979B2 (en) Method and apparatus for evaluating image acquisition accuracy, electronic device and storage medium
CN112906529B (en) Face recognition light supplementing method, device, face recognition equipment and system thereof
CN106546263B (en) A kind of laser leveler shoot laser line detecting method based on machine vision
CN108229587B (en) Autonomous transmission tower scanning method based on hovering state of aircraft
CN108919830A (en) A kind of flight control method that unmanned plane precisely lands
CN108564629A (en) A kind of scaling method and system of vehicle-mounted camera external parameter
CN110086995B (en) Image brightness adjusting method and device and unmanned aerial vehicle
CN111932504B (en) Edge contour information-based sub-pixel positioning method and device
CN115542941A (en) Control method and system for guiding unmanned aerial vehicle to land accurately
CN110400278A (en) A kind of full-automatic bearing calibration, device and the equipment of color of image and geometric distortion
CN107068042B (en) Image processing method
CN114708326A (en) Full-automatic camera calibration system and method for adaptively adjusting brightness and ambiguity
CN112461845A (en) Solar cell panel rainbow texture detection method based on double-light integration
US8218877B2 (en) Tracking vehicle method by using image processing
CN112990034A (en) Traffic sign change detection method with priority image
JP5080416B2 (en) Image processing apparatus for detecting an image of a detection object from an input image
CN115984360B (en) Method and system for calculating length of dry beach based on image processing
CN116402784A (en) Auxiliary centering method, system, equipment and storage medium based on machine vision
CN116736259A (en) Laser point cloud coordinate calibration method and device for tower crane automatic driving
CN113610782B (en) Building deformation monitoring method, equipment and storage medium
CN115484712A (en) Control method and device for tunnel entrance lighting system and storage medium thereof
CN113689389A (en) Crop height measuring method, and gear adjusting method and device of plant protection machine
CN112712511A (en) Air tightness detection method and device based on computer vision and artificial intelligence
CN111563883A (en) Screen visual positioning method, positioning device and storage medium
CN106413280A (en) Automatic surface mounting machine feeder element position correction device and automatic correction method based on image processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination