CN113222838A - Unmanned aerial vehicle autonomous line patrol method based on visual positioning - Google Patents

Unmanned aerial vehicle autonomous line patrol method based on visual positioning Download PDF

Info

Publication number
CN113222838A
CN113222838A (application number CN202110495250.7A)
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
line
image
coordinate system
Prior art date
Legal status: Pending (assumed; not a legal conclusion)
Application number
CN202110495250.7A
Other languages
Chinese (zh)
Inventor
邵云峰
李闯
张涵羽
马中静
邹苏郦
刘永强
范益民
任海鹏
魏炜
李斌
任津京
Current Assignee
Luliang Power Supply Co of State Grid Shanxi Electric Power Co Ltd
Original Assignee
Beijing Institute of Technology BIT
Luliang Power Supply Co of State Grid Shanxi Electric Power Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT, Luliang Power Supply Co of State Grid Shanxi Electric Power Co Ltd filed Critical Beijing Institute of Technology BIT
Priority claimed from CN202110495250.7A
Publication of CN113222838A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 — Image enhancement or restoration
    • G06T5/70 — Denoising; smoothing
    • G06T7/00 — Image analysis
    • G06T7/10 — Segmentation; edge detection
    • G06T7/13 — Edge detection
    • G06T7/136 — Segmentation; edge detection involving thresholding
    • G06T7/168 — Segmentation; edge detection involving transform domain methods
    • G06T7/70 — Determining position or orientation of objects or cameras
    • G06T7/90 — Determination of colour characteristics
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/20 — Special algorithmic details
    • G06T2207/20024 — Filtering details
    • G06T2207/20028 — Bilateral filtering
    • G06T2207/20048 — Transform domain processing
    • G06T2207/20061 — Hough transform

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to an unmanned aerial vehicle autonomous line patrol method based on visual positioning, and belongs to the technical field of unmanned aerial vehicle navigation and control. In the method, a camera acquires an image, and an accurate current position of the unmanned aerial vehicle is obtained through image preprocessing, feature extraction, and visual positioning; the position information obtained by visual positioning is fed into a designed position control algorithm to complete navigation, so that the autonomous line patrol task of the unmanned aerial vehicle is realized. The method overcomes the problems that the positioning error of a traditional inertial navigation system accumulates over time and that GPS navigation has poor autonomy and is easily interfered with, and it offers rapid recognition, high positioning precision, and strong stability. The unmanned aerial vehicle can replace manual remote-control operation and inspect autonomously along a preset route over long-endurance flights, with autonomy, reliability, and safety. The invention provides a new platform for inspection of overhead transmission lines.

Description

Unmanned aerial vehicle autonomous line patrol method based on visual positioning
Technical Field
The invention relates to an unmanned aerial vehicle autonomous line patrol method based on visual positioning, and belongs to the technical field of unmanned aerial vehicle navigation and control.
Background
With the rising living standards and social progress of China, the mileage of its transmission and distribution networks keeps increasing. These lines span regions: they are numerous and widely distributed, the terrain is complex, and the environment is harsh. Exposed in the field for long periods, transmission lines are affected by material aging, lightning flashover, icing, tree barriers, bird damage, sustained mechanical tension, man-made structures, and other factors, and are thus prone to abrasion, collapse, strand breakage, and corrosion. Without timely equipment-state diagnosis and early warning, the safe and stable operation of the power grid is affected. Inspection of transmission and distribution lines and equipment-state early warning are therefore key links in ensuring safe and stable grid operation.
Compared with traditional manual inspection, fixed-wing unmanned aerial vehicle inspection, and helicopter inspection, multi-rotor unmanned aerial vehicle inspection has been widely applied to power-line inspection because it can hover, inspects efficiently, is highly autonomous, and has low labor cost. The key to autonomous flight and line patrol is high-precision positioning and navigation, and existing unmanned aerial vehicles mainly rely on an inertial navigation system (INS) and the Global Positioning System (GPS) to navigate. However, the inertial navigation system suffers from fast accumulation of positioning errors over time and over-sensitivity to initial values, and a high-precision inertial navigation system is not only heavy and bulky but also expensive; the GPS navigation system has poor autonomy, is easily interfered with, has a low information update rate, and has relatively large errors.
Disclosure of Invention
The invention aims to solve the problem that autonomous inspection is difficult to realize due to low navigation and positioning accuracy, poor stability and other reasons when a multi-rotor unmanned aerial vehicle is used for power line inspection, and provides an unmanned aerial vehicle autonomous line inspection method based on visual positioning. The method comprises the steps that a camera is used for obtaining an image, and current accurate positioning of the unmanned aerial vehicle is obtained through image preprocessing, feature extraction and visual positioning; and the navigation of the unmanned aerial vehicle is completed by using the position information obtained by visual positioning through a designed position control algorithm, so that the autonomous line patrol task of the unmanned aerial vehicle is realized.
The purpose of the invention is realized by the following technical scheme:
the method comprises the following steps: and image preprocessing, namely performing graying and filtering processing on the original image acquired by the camera.
Firstly, the color image is grayed using a weighted-average method, in which the three color components are averaged with different weights according to their importance. Because the human eye is most sensitive to green and least sensitive to blue, a reasonable grayscale image is obtained by weighting the RGB components as follows:
Gray(i,j) = 0.299·R(i,j) + 0.587·G(i,j) + 0.114·B(i,j)  (1)
where Gray is the gray-level value, R, G, B are the red, green, and blue components of each pixel in the image, and (i, j) are the coordinates of the pixel.
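As an illustration (not part of the patent), the weighted-average graying can be sketched in Python; the function names and the nested-list image format are assumptions made for the example:

```python
def rgb_to_gray(r, g, b):
    # Weighted average: green weighted highest and blue lowest,
    # matching the eye's sensitivity (weights 0.299 / 0.587 / 0.114).
    return 0.299 * r + 0.587 * g + 0.114 * b

def grayscale(image):
    # image: nested lists of (R, G, B) tuples -> nested lists of gray values
    return [[rgb_to_gray(*px) for px in row] for row in image]
```

Since the weights sum to 1, a pure white pixel maps to 255 and a pure black pixel to 0.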
An image or video obtained from a vision sensor is susceptible to noise and background interference and must be filtered. To protect the edge information of the image while filtering out interference, a bilateral filtering method is adopted; the filtered value BF can be expressed as:
BF = (1/W_q) · Σ_{q∈S} G_S(‖p−q‖) · G_r(|I_p − I_q|) · I_q  (2)
where I_p and I_q are pixel values, S is the filter window, and G_S is the spatial-distance weight, which can be expressed as:
G_S = exp(−‖p−q‖² / (2σ_s²))  (3)
G_r is the pixel-value weight, which can be expressed as:
G_r = exp(−|I_p − I_q|² / (2σ_r²))  (4)
and W_q, the sum of the weights over each pixel in the filter window, normalizes the weights:
W_q = Σ_{q∈S} G_S(‖p−q‖) · G_r(|I_p − I_q|)  (5)
In a flat region, G_r is nearly the same for every pixel in the filter window, so the spatial-distance weight G_S dominates the filtering. In an edge region, the G_r values on the same side of the edge as the center pixel are close to each other and far greater than those on the other side, so pixels on the other side hardly influence the filtering result and the edge information is protected. When a noise point appears in a flat region, the weights of the surrounding signal pixels are all small, but after normalization these weights are boosted, so noise points are filtered out as well.
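The bilateral filter described above can be sketched as a plain-Python reference implementation (illustrative only; the window radius and the σ values are assumed parameters, not values from the patent):

```python
import math

def bilateral_filter(img, radius=1, sigma_s=1.0, sigma_r=25.0):
    # img: nested lists of gray values; returns a filtered copy.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            acc, wsum = 0.0, 0.0
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        # Spatial-distance weight G_S and pixel-value weight G_r
                        gs = math.exp(-(di * di + dj * dj) / (2 * sigma_s ** 2))
                        gr = math.exp(-((img[ni][nj] - img[i][j]) ** 2)
                                      / (2 * sigma_r ** 2))
                        acc += gs * gr * img[ni][nj]
                        wsum += gs * gr
            out[i][j] = acc / wsum   # normalization by W_q
    return out
```

On a flat patch the output equals the input, while across a step edge each side keeps its value, which is exactly the edge-preserving behavior the text describes.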
Step two: and (4) image feature extraction, including linear detection and fusion, to obtain the contour lines of the power transmission line and the tower.
Through comparative analysis, the FLD (fast line detector) algorithm is selected for line-segment extraction. Compared with the traditional Hough transform and LSD line detection, FLD offers a good recognition effect at high speed, and thus better suits the rapidity requirement of visual positioning.
The FLD algorithm identifies all straight line segments in the image, so to extract the required target features, a line-fusion algorithm must be designed: irrelevant segments are removed using the angle and length of each identified segment, and the required lines are then obtained by grouping and fitting.
The FLD algorithm returns the two endpoint coordinates of every line segment in the image. From the endpoints, the polar coordinates (ρ, θ) and length of each segment in the pixel coordinate system are calculated. Because the target features are lines and towers, the segment angles are known in advance, so irrelevant segments with inconsistent angles are removed first and only segments close to the target angle are retained. The segments are then divided into groups by their ρ values, the total segment length of each group is calculated, and groups whose total length is too small are removed, yielding the target lines.
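A minimal sketch of this angle-filter-then-group-by-ρ fusion step might look as follows (the tolerances, thresholds, and function names are assumptions for illustration; the patent does not give concrete values):

```python
import math

def seg_polar(x1, y1, x2, y2):
    # theta: segment direction in [0, pi); rho: signed distance of the
    # segment's supporting line from the origin (normal form).
    theta = math.atan2(y2 - y1, x2 - x1) % math.pi
    n = (theta + math.pi / 2) % math.pi          # normal direction
    rho = x1 * math.cos(n) + y1 * math.sin(n)
    return rho, theta

def fuse_segments(segs, target_theta, ang_tol=0.1, rho_tol=5.0, min_len=40.0):
    # 1) keep only segments whose angle is close to the known target angle
    kept = []
    for s in segs:
        rho, theta = seg_polar(*s)               # (simplified: no pi wrap-around)
        if abs(theta - target_theta) <= ang_tol:
            length = math.hypot(s[2] - s[0], s[3] - s[1])
            kept.append((rho, length, s))
    # 2) group by rho, 3) drop groups whose total length is too small
    groups = {}
    for rho, length, s in kept:
        groups.setdefault(round(rho / rho_tol), []).append((length, s))
    return [[s for _, s in members] for members in groups.values()
            if sum(l for l, _ in members) >= min_len]
```

Two collinear horizontal fragments end up in one group (one target line), while a short stray segment and a vertical segment are discarded.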
Step three: and (4) visual positioning, wherein the current position of the unmanned aerial vehicle is calculated by using the target characteristics obtained by fusion.
The camera imaging process is in fact the projection of an object in the three-dimensional world onto a two-dimensional imaging plane. According to the camera imaging principle, the projection of a three-dimensional point onto the imaging plane in the camera coordinate system is first obtained:
x = f·X_C/Z_C,  y = f·Y_C/Z_C  (6)
where Oxy is the image coordinate system (i.e., the image plane), O_C X_C Y_C Z_C is the camera coordinate system with its origin at the optical center of the lens, f is the focal length of the camera, (x, y) are the coordinates of a point in the image coordinate system, and (X_C, Y_C, Z_C) are the coordinates of the point in the camera coordinate system.
Since the image is stored in the computer as pixels, a pixel coordinate system is needed to reflect the arrangement of pixels in the camera's CCD/CMOS chip. Its origin lies at the upper-left corner of the image, its u and v axes are parallel to the two perpendicular edges of the image plane, and its unit is the pixel. The image coordinate system is converted to the pixel coordinate system by:
u = x/d_x + u_0,  v = y/d_y + v_0  (7)
where d_x and d_y are camera intrinsic parameters giving the physical size of a pixel along the x and y axes, and (u_0, v_0) are the coordinates of the principal point (image origin).
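The conversion between image coordinates and pixel coordinates, u = x/d_x + u_0 and v = y/d_y + v_0, can be illustrated with a pair of small helpers (the intrinsic values used below are made up for the example):

```python
def image_to_pixel(x, y, dx, dy, u0, v0):
    # Forward mapping: metric image-plane coordinates -> pixel coordinates
    return x / dx + u0, y / dy + v0

def pixel_to_image(u, v, dx, dy, u0, v0):
    # Inverse mapping, as used by the visual-positioning step
    return (u - u0) * dx, (v - v0) * dy
```

With an assumed 0.01 mm pixel pitch and principal point (960, 540), the image-plane origin lands on the principal point and the mapping round-trips exactly.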
The principle of visual positioning is exactly the inverse of the camera imaging process: the position information of the three-dimensional world is deduced back from the pixel coordinate system. When the camera directly faces the line, the two imaged lines are two horizontal straight lines. The pixel coordinates are first converted to image coordinates:
y_1 = v_1·d_y − v_0·d_y
y_2 = v_2·d_y − v_0·d_y  (8)
where y_1 and y_2 are the ordinates of the two lines in the image coordinate system, and v_1 and v_2 are the ordinates of the two lines in the pixel coordinate system.
Then, according to the imaging principle and the relative position of the two transmission lines, a system of equations is obtained:
y_1/f = Z_1/Y_1,  y_2/f = Z_2/Y_2,  Y_2 = Y_1 + d,  Z_2 = Z_1 + h  (9)
where d is the horizontal distance between the two transmission lines, h is the vertical distance between the two transmission lines, Y_1 and Y_2 are the horizontal distances between the unmanned aerial vehicle and the two transmission lines, and Z_1 and Z_2 are the vertical distances between the unmanned aerial vehicle and the two transmission lines.
Rearranging gives:
Y_1 = (f·h − y_2·d)/(y_2 − y_1),  Z_1 = y_1·Y_1/f  (10)
Therefore, with only two power transmission lines whose relative position is known, and with the camera facing the line squarely, the visual positioning algorithm yields the horizontal and vertical distances between the unmanned aerial vehicle and the power transmission lines, which ensures the safety distance of the unmanned aerial vehicle during line patrol.
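A sketch of the two-line positioning computation, under the reconstructed model y_i = f·Z_i/Y_i with known line offsets d (horizontal) and h (vertical) between the two conductors (the function name and test geometry are illustrative, not from the patent):

```python
def locate_from_two_lines(y1, y2, f, d, h):
    # Solve  y1 = f*Z1/Y1,  y2 = f*Z2/Y2,  Y2 = Y1 + d,  Z2 = Z1 + h
    # for Y1 (horizontal) and Z1 (vertical) distances to the nearer line.
    Y1 = (f * h - y2 * d) / (y2 - y1)
    Z1 = y1 * Y1 / f
    return Y1, Z1
```

Feeding in image ordinates synthesized from a known geometry recovers that geometry, which is a quick consistency check on the algebra.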
When the unmanned aerial vehicle photographs a tower, straight-line recognition and fusion yield the two contour lines of the tower, from which its vertical center line is calculated. Using the position of the tower center line in the pixel coordinate system, the position of the unmanned aerial vehicle in the direction of advance (parallel to the line direction) is obtained:
X = x·Y/f  (11)
where X is the distance between the unmanned aerial vehicle and the vertical center line of the tower along the transmission-line direction, x is the abscissa of the tower's vertical center line in the image coordinate system, and Y is the horizontal distance from the unmanned aerial vehicle to the line.
Step four: and a position controller is designed to realize the accurate navigation and autonomous line patrol task of the unmanned aerial vehicle.
The accurate position of the unmanned aerial vehicle relative to the transmission line, obtained by visual positioning, is used as position feedback for a designed position controller, so that the unmanned aerial vehicle flies autonomously along the desired route.
Firstly, the control node sends a take-off instruction; the unmanned aerial vehicle takes off and hovers at an approximate line-patrol distance, with the camera gimbal guaranteed to face the transmission line. The visual positioning node is started, Kalman filtering is applied to the visual positioning output, and the horizontal and vertical distances between the unmanned aerial vehicle and the transmission line are obtained as initial values. The first stage then begins: the unmanned aerial vehicle is controlled with inertial navigation information to fly at constant speed along the transmission line while keeping a constant, safe distance from it; meanwhile, the difference between the visual positioning result and the inertial navigation information is computed and accumulated, and at regular intervals the accumulated error is compared with a set threshold. If the threshold is exceeded, the inertial navigation information is corrected; otherwise the error keeps accumulating until the next check. When the unmanned aerial vehicle approaches a tower, the second stage begins: the camera captures the tower and its vertical center line is extracted, giving visual positioning in the direction parallel to the transmission line, with which the inertial navigation information is corrected. The unmanned aerial vehicle is controlled to photograph the tower at the set position point and then continues flying forward. The first and second stages repeat until the inspection task for the transmission line and towers is completed.
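The accumulate-and-correct logic of the first stage can be sketched as a single fusion step (an illustrative simplification of the described scheme; the function name, the 1-D state, and the threshold semantics are assumptions):

```python
def fuse_step(ins_pos, vis_pos, acc_err, threshold):
    # Accumulate the discrepancy between the visual fix and the INS
    # estimate; once the accumulated error exceeds the threshold,
    # correct the INS with the visual fix and reset the accumulator.
    acc_err += abs(vis_pos - ins_pos)
    if acc_err > threshold:
        return vis_pos, 0.0, True      # corrected position, reset, flag
    return ins_pos, acc_err, False     # keep INS estimate, keep accumulating
```

Running two successive steps shows the intended behavior: a small discrepancy only accumulates, and the correction fires once the running sum crosses the threshold.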
The invention has the following advantages:
1. The visual positioning algorithm designed by the invention overcomes the problems that the positioning error of a traditional inertial navigation system accumulates over time and that a GPS navigation system has poor autonomy and is easily interfered with; it offers rapid recognition, high positioning precision, and strong stability.
2. The autonomous navigation line-patrol algorithm designed by the invention can replace manual remote-control operation and realizes autonomous inspection of the unmanned aerial vehicle along a preset route over long-endurance flights, with autonomy, reliability, and safety.
Drawings
FIG. 1 is a bilateral filtering method processing result;
FIG. 2 is a camera imaging principle;
FIG. 3 is a diagram of the transformation relationship between the image coordinate system and the pixel coordinate system;
FIG. 4 is a target straight line in a pixel coordinate system;
FIG. 5 is an imaging of two parallel power lines;
fig. 6 is an overall control flow of unmanned aerial vehicle transmission line inspection.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and examples, together with the technical problems solved and the advantages obtained. It should be noted that the described embodiments are intended only to aid understanding of the invention and impose no limitation on it.
In the embodiment, the transmission line and the tower are selected as target features, all straight line segments are extracted from the image by using an FLD straight line detection algorithm, and the vertical center lines of the two transmission lines and the tower are obtained through feature fusion.
After the required target characteristics are obtained, a corresponding visual positioning algorithm is designed by utilizing a camera imaging principle, the position information of the unmanned aerial vehicle relative to the power transmission line and the tower at present is obtained, and finally, a position controller is designed to realize the accurate navigation and the autonomous line patrol task of the unmanned aerial vehicle.
An unmanned aerial vehicle autonomous line patrol method based on visual positioning comprises the following steps:
the method comprises the following steps: and image preprocessing, namely performing graying and filtering processing on the original image acquired by the camera.
Firstly, the color image is grayed. Because the human eye is most sensitive to green and least sensitive to blue, a weighted-average method is adopted in which the three color components are averaged with different weights according to their importance.
An image or video obtained from a vision sensor is susceptible to noise and background interference and must be filtered. To protect the edge information of the image while filtering out interference, a bilateral filtering method is adopted; the filtering result is shown in fig. 1. It can be seen that bilateral filtering adds a pixel-value weight on top of Gaussian filtering, taking into account the difference in pixel values on either side of an edge as well as the spatial distribution of pixels, so it retains the edge information in the image well while effectively filtering out noise.
Step two: and (4) image feature extraction, including linear detection and fusion, to obtain the contour lines of the power transmission line and the tower.
Common line-extraction algorithms include the Hough line transform, LSD line detection, and FLD line detection. The three algorithms were tested on the same image, with the results shown in the following table:
[Table: recognition effect and processing time of Hough, LSD, and FLD line detection; table content not recoverable from the source.]
Of the three, the Hough line transform detects worst, losing much of the line information in the image, and takes the longest time. The FLD and LSD algorithms have similar recognition effects, both far better than the Hough transform, and both extract the target line information well; of the two, FLD processes an image in about half the time of LSD and thus extracts more efficiently. On this analysis, the FLD line detection algorithm is selected: it recognizes well and runs fast, and better suits the rapidity requirement of visual positioning.
The FLD algorithm identifies all straight line segments in the image, so to extract the required target features, a line-fusion algorithm must be designed: irrelevant segments are removed using the angle and length of each identified segment, and the required lines are then obtained by grouping and fitting.
The FLD algorithm returns the two endpoint coordinates of every line segment in the image. From the endpoints, the polar coordinates (ρ, θ) and length of each segment in the pixel coordinate system are calculated. Because the target features are lines and towers, the segment angles are known in advance, so irrelevant segments with inconsistent angles are removed first and only segments close to the target angle are retained. The segments are then divided into groups by their ρ values, the total segment length of each group is calculated, and groups whose total length is too small are removed, yielding the target lines.
Step three: and (4) visual positioning, wherein the current position of the unmanned aerial vehicle is calculated by using the target characteristics obtained by fusion.
The camera imaging process is in fact the projection of an object in the three-dimensional world onto a two-dimensional imaging plane. According to the camera imaging principle shown in fig. 2, the projection of a three-dimensional point onto the imaging plane in the camera coordinate system is:
x = f·X_C/Z_C,  y = f·Y_C/Z_C  (1)
Then, according to the transformation relationship shown in fig. 3, the image coordinate system is converted to the pixel coordinate system by:
u = x/d_x + u_0,  v = y/d_y + v_0  (2)
the principle of visual positioning is just opposite to the imaging process of a camera, the purpose is to reversely deduce the position information of the three-dimensional world from the pixel coordinate system, when the camera is over against the line, the shot image is as shown in fig. 4, which is two horizontal straight lines, firstly, the pixel coordinate system is converted into the image coordinate system, and the formula is as follows:
y1=v1dy-vody
y2=v2dy-vody (3)
then, according to the imaging principle shown in fig. 5 and the position relationship between the two transmission lines, a system of equations is obtained:
Figure BDA0003054130200000075
finishing to obtain:
Figure BDA0003054130200000076
Therefore, with only two power transmission lines whose relative position is known, and with the camera facing the line squarely, the visual positioning algorithm yields the horizontal and vertical distances between the unmanned aerial vehicle and the power transmission lines, which ensures the safety distance of the unmanned aerial vehicle during line patrol.
When the unmanned aerial vehicle photographs a tower, straight-line recognition and fusion yield the two contour lines of the tower, from which its vertical center line is calculated. Using the position of the tower center line in the pixel coordinate system, the position of the unmanned aerial vehicle in the direction of advance (parallel to the line direction) is obtained:
X = x·Y/f  (6)
where X is the distance between the unmanned aerial vehicle and the vertical center line of the tower along the transmission-line direction, x is the abscissa of the tower's vertical center line in the image coordinate system, and Y is the horizontal distance from the unmanned aerial vehicle to the line.
A power transmission line model was built to physically verify the visual positioning method. The Y-axis and Z-axis visual positioning results are shown in the following table:
[Table: Y-axis and Z-axis visual positioning experiment results; table content not recoverable from the source.]
The X-axis visual positioning results are shown in the following table:
[Table: X-axis visual positioning experiment results; table content not recoverable from the source.]
As can be seen from the tables, the visual positioning of the unmanned aerial vehicle inspection system reaches centimeter-level precision with a maximum error of about 10 centimeters. This satisfies the navigation-precision requirement of unmanned aerial vehicle power inspection, provides high-precision positioning information, and ensures the safety distance during the inspection process.
Step four: and a position controller is designed to realize the accurate navigation and autonomous line patrol task of the unmanned aerial vehicle.
The accurate position of the unmanned aerial vehicle relative to the transmission line, obtained by visual positioning, is used as position feedback for a designed position controller, so that the unmanned aerial vehicle flies autonomously along the desired route.
The overall control flow is shown in fig. 6. Firstly, the control node sends a take-off instruction; the unmanned aerial vehicle takes off and hovers at an approximate line-patrol distance, with the camera gimbal guaranteed to face the transmission line. The visual positioning node is started, Kalman filtering is applied to the visual positioning output, and the horizontal and vertical distances between the unmanned aerial vehicle and the transmission line are obtained as initial values. The first stage then begins: the unmanned aerial vehicle is controlled with inertial navigation information to fly at constant speed along the transmission line while keeping a constant, safe distance from it; meanwhile, the difference between the visual positioning result and the inertial navigation information is computed and accumulated, and at regular intervals the accumulated error is compared with a set threshold. If the threshold is exceeded, the inertial navigation information is corrected; otherwise the error keeps accumulating until the next check. When the unmanned aerial vehicle approaches a tower, the second stage begins: the camera captures the tower and its vertical center line is extracted, giving visual positioning in the direction parallel to the transmission line, with which the inertial navigation information is corrected. The unmanned aerial vehicle is controlled to photograph the tower at the set position point and then continues flying forward. The first and second stages repeat until the inspection task for the transmission line and towers is completed.
The above detailed description is intended to illustrate the objects, aspects and advantages of the present invention, and it should be understood that the above detailed description is only exemplary of the present invention and is not intended to limit the scope of the present invention, and any modifications, equivalents, improvements and the like within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (5)

1. An unmanned aerial vehicle autonomous line patrol method based on visual positioning is characterized by comprising the following steps:
the method comprises the following steps: image preprocessing, namely performing graying and filtering processing on an original image acquired by a camera;
step two: extracting image characteristics, including linear detection and fusion, to obtain the contour lines of the power transmission line and the tower;
step three: visual positioning, namely calculating the current position of the unmanned aerial vehicle by using the target characteristics obtained by fusion;
step four: and a position controller is designed to realize the accurate navigation and autonomous line patrol task of the unmanned aerial vehicle.
2. The unmanned aerial vehicle autonomous line patrol method based on visual positioning as claimed in claim 1, wherein the implementation method of the first step is:
firstly, the color image is converted to grayscale; a weighted-average method is adopted, weighting the three components according to indexes such as their importance; because the human eye is most sensitive to green and least sensitive to blue, a reasonable grayscale image can be obtained by taking the weighted average of the three RGB components according to the following formula:
Gray(i,j) = 0.299·R(i,j) + 0.587·G(i,j) + 0.114·B(i,j) (1)
r, G, B, wherein, Gray is the Gray value, and (i, j) is the coordinate of the pixel point;
images and video obtained from the vision sensor are susceptible to noise and background interference and must be filtered; to remove interference while preserving the edge information of the image, a bilateral filtering method is adopted, and the filtering result BF can be expressed as:
BF = (1/W_q) · Σ_{p∈S} G_S · G_r · I_p (2)
where S is the filtering window centered on pixel q, I_p is the value of pixel p, and G_S is the spatial distance weight, which can be expressed as:
G_S = exp( −‖p − q‖² / (2σ_s²) ) (3)
G_r is the pixel value weight, which can be expressed as:
G_r = exp( −(I_p − I_q)² / (2σ_r²) ) (4)
W_q is the sum of the weights over all pixels in the filtering window, which normalizes the weights and can be expressed as:
W_q = Σ_{p∈S} G_S · G_r (5)
in flat regions, the G_r values of the pixels within the filter window are close, so the spatial distance weight G_S dominates the filtering result; in edge regions, pixels on the same side of the edge as the center have close G_r values that are far greater than the G_r values on the other side of the edge, so pixels across the edge contribute almost nothing to the filtering result and the edge information is preserved; when a noise point appears in a flat region, the weights of the surrounding signal pixels are small but are raised by normalization, so the noise point is also filtered out.
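A naive sketch of the bilateral filter in formulas (2)–(5); the window clipping at the image borders and the parameter defaults are illustrative choices, not specified in the patent:

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Naive bilateral filter per formulas (2)-(5): each output pixel is a
    normalized sum of its neighbours weighted by spatial distance (G_S)
    and pixel-value difference (G_r). img: 2-D grayscale array."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            # clip the window at the image borders
            i0, i1 = max(0, i - radius), min(h, i + radius + 1)
            j0, j1 = max(0, j - radius), min(w, j + radius + 1)
            window = img[i0:i1, j0:j1]
            yy, xx = np.mgrid[i0:i1, j0:j1]
            g_s = np.exp(-((yy - i) ** 2 + (xx - j) ** 2) / (2 * sigma_s ** 2))
            g_r = np.exp(-((window - img[i, j]) ** 2) / (2 * sigma_r ** 2))
            weights = g_s * g_r
            out[i, j] = np.sum(weights * window) / np.sum(weights)  # W_q normalization
    return out
```

A constant image passes through unchanged, and a sharp step between 0 and 100 is preserved when sigma_r is small relative to the step, which is the edge-preserving behaviour described above.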
3. The unmanned aerial vehicle autonomous line patrol method based on visual positioning as claimed in claim 1, wherein the implementation method of step two is:
through comparative analysis, the FLD (fast line detector) algorithm is selected for line-segment extraction; compared with the traditional Hough transform and LSD line detection, FLD offers good recognition quality and high speed, making it better suited to the real-time requirement of visual positioning;
the FLD line detection algorithm identifies all straight-line segments in the image; to extract the required target features from them, a line-fusion algorithm is designed: irrelevant segments are removed using the angle and length of each identified segment, and the remaining segments are grouped and fitted to obtain the required lines;
the FLD line detection algorithm outputs the two endpoint coordinates of every segment in the image; from the endpoints, the polar coordinates (ρ, θ) and the length of each segment in the pixel coordinate system are computed; because the target features are lines and towers, the segment angles are known in advance, so segments whose angles are inconsistent are removed first and only those close to the target angle are kept; the survivors are then divided into groups by their ρ values, the total segment length of each group is computed, and groups whose total length is too small are removed, yielding the target lines.
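The segment-fusion procedure can be sketched as below, assuming segments arrive as endpoint pairs in pixel coordinates; the angle tolerance, ρ bin width, and minimum-length parameters are illustrative, not the patent's values:

```python
import math

def fuse_segments(segments, target_theta, angle_tol=0.1,
                  rho_bin=10.0, min_total_len=50.0):
    """Segment fusion: drop segments whose angle differs from the target,
    bucket survivors by rho, discard short buckets, and return one
    length-weighted rho per surviving bucket.
    segments: iterable of ((x1, y1), (x2, y2)) endpoint pairs."""
    groups = {}
    for (x1, y1), (x2, y2) in segments:
        theta = math.atan2(y2 - y1, x2 - x1) % math.pi  # segment direction
        if abs(theta - target_theta) > angle_tol:
            continue  # angle inconsistent with the line/tower feature
            # (note: no wrap-around handling near theta = 0/pi in this sketch)
        length = math.hypot(x2 - x1, y2 - y1)
        # unsigned distance from the origin to the infinite line through the segment
        rho = abs(x1 * (y2 - y1) - y1 * (x2 - x1)) / length
        groups.setdefault(round(rho / rho_bin), []).append((rho, length))
    fused = []
    for members in groups.values():
        total = sum(l for _, l in members)
        if total >= min_total_len:  # remove groups whose summed length is too small
            fused.append(sum(r * l for r, l in members) / total)  # weighted fit
    return sorted(fused)
```

Two broken horizontal conductors at y = 100 and y = 200 are fused into two lines, while a short diagonal noise segment is rejected by the angle test.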
4. The unmanned aerial vehicle autonomous line patrol method based on visual positioning as claimed in claim 1, wherein the implementation method of step three is:
the camera imaging process projects an object in the three-dimensional world onto a two-dimensional imaging plane; according to the camera imaging principle, the formula for projecting a three-dimensional point onto the imaging plane in the camera coordinate system is first obtained:
x = f·X_C / Z_C,  y = f·Y_C / Z_C (6)
where Oxy is the image coordinate system (the imaging plane), O_C X_C Y_C Z_C is the camera coordinate system with origin at the optical center of the lens, f is the focal length of the camera, (x, y) are the coordinates of a point in the image coordinate system, and (X_C, Y_C, Z_C) are the coordinates of the same point in the camera coordinate system;
since the image is stored in the computer as pixels, a pixel coordinate system is needed to reflect the arrangement of pixels in the camera's CCD/CMOS chip; its origin is at the upper-left corner of the image, its u and v axes are parallel to the two perpendicular edges of the image plane, and its unit is the pixel; the image coordinate system can be converted to the pixel coordinate system according to the following formula:
u = x/d_x + u_0,  v = y/d_y + v_0 (7)
where d_x and d_y are camera intrinsic parameters giving the physical size of a pixel along the x and y axes, and (u_0, v_0) are the coordinates of the principal point (image origin);
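Formula (7) and its inverse (the form used in formula (8)) can be sketched directly:

```python
def image_to_pixel(x, y, dx, dy, u0, v0):
    """Formula (7): physical image-plane coordinates (x, y) -> pixel (u, v)."""
    return x / dx + u0, y / dy + v0

def pixel_to_image(u, v, dx, dy, u0, v0):
    """Inverse of formula (7), e.g. y = (v - v0) * dy as in formula (8)."""
    return (u - u0) * dx, (v - v0) * dy
```

The two functions are exact inverses, so converting a point to pixels and back recovers the original image coordinates.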
the principle of visual positioning is the inverse of the camera imaging process: position information in the three-dimensional world is recovered from the pixel coordinate system; when the camera faces the line squarely, the two conductors appear in the image as two horizontal lines; the pixel coordinates are first converted back to image coordinates according to the formula:
y_1 = (v_1 − v_0)·d_y
y_2 = (v_2 − v_0)·d_y (8)
where y_1 and y_2 are the ordinates of the two lines in the image coordinate system, and v_1, v_2 are their ordinates in the pixel coordinate system;
then, from the imaging principle and the positional relation between the two power transmission lines, a system of equations is obtained:
y_1 = f·Z_1 / Y_1,  y_2 = f·Z_2 / Y_2,  Y_2 = Y_1 + d,  Z_2 = Z_1 + h (9)
where d is the horizontal distance between the two power transmission lines, h is the vertical distance between them, Y_1 and Y_2 are the horizontal distances from the unmanned aerial vehicle to the two lines, and Z_1, Z_2 are the corresponding vertical distances (in the notation of formula (6), the horizontal distance plays the role of the depth Z_C, and the vertical distance that of the offset Y_C); solving the system yields:
Y_1 = (f·h − y_2·d) / (y_2 − y_1),  Z_1 = y_1·Y_1 / f (10)
therefore, with only two power transmission lines of known relative position, and with the camera facing the lines squarely, this visual positioning algorithm yields the horizontal and vertical distances between the unmanned aerial vehicle and the power transmission line, so the safety distance during line patrol can be guaranteed;
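Under the geometry assumed above (two conductors offset by d horizontally and h vertically, camera facing the line, so y_i = f·Z_i/Y_i with Y_2 = Y_1 + d and Z_2 = Z_1 + h), the positioning equations can be solved in closed form. The sign conventions here are a reconstruction, since the patent's equation images are not reproduced:

```python
def locate_from_two_lines(y1, y2, f, d, h):
    """Solve the two-line positioning system for the horizontal (Y1) and
    vertical (Z1) distances to the nearer conductor, given the image
    ordinates y1, y2 of the two lines, focal length f, and offsets d, h."""
    Y1 = (f * h - y2 * d) / (y2 - y1)  # eliminate Z between the two projections
    Z1 = y1 * Y1 / f
    return Y1, Z1
```

A round-trip check: project a known pose through y_i = f·Z_i/Y_i, then recover it.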
when the unmanned aerial vehicle photographs a tower, line detection and fusion yield the two contour lines of the tower, from which its vertical centerline is computed; using the position of the tower centerline in the pixel coordinate system, the position of the unmanned aerial vehicle along the direction of travel (parallel to the line direction) can be obtained:
X = x·Y_1 / f (11)  (Y_1 is the horizontal distance to the line obtained from formula (10))
where X is the distance between the unmanned aerial vehicle and the vertical centerline of the tower along the transmission-line direction, and x is the abscissa of the tower's vertical centerline in the image coordinate system.
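Formula (11) inverts the projection x = f·X/depth from formula (6); treating the horizontal stand-off distance from the previous step as the depth is an assumption of this sketch:

```python
def forward_distance(x_centerline, depth, f):
    """Along-line offset X of the UAV from the tower's vertical centerline,
    given the centerline abscissa (image coordinates) and the depth to the
    tower. Inverts x = f * X / depth."""
    return x_centerline * depth / f
```

Projecting a known 4 m offset at 10 m depth and inverting recovers the offset.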
5. The unmanned aerial vehicle autonomous line patrol method based on visual positioning as claimed in claim 1, wherein the implementation method of step four is:
the accurate position information of the unmanned aerial vehicle relative to the power transmission line obtained through visual positioning is used as position feedback, and a position controller is designed so that the unmanned aerial vehicle flies autonomously along the expected route;
firstly, the control node sends a takeoff instruction; the unmanned aerial vehicle takes off and hovers at approximately the line-patrol distance, with the camera gimbal facing the power transmission line; the visual positioning node is started, Kalman filtering is applied to the visual positioning output, and the horizontal and vertical distances between the unmanned aerial vehicle and the power transmission line are obtained as initial values; the first stage is then entered: inertial navigation information is used to control the unmanned aerial vehicle to fly at constant speed along the power transmission line, keeping the distance to the line constant and within the safety range, while the difference between the visual positioning result and the inertial navigation information is computed and accumulated; at regular intervals the accumulated error is compared with a set threshold, and if it exceeds the threshold the inertial navigation information is corrected, otherwise the error continues to accumulate until the next check; when the unmanned aerial vehicle approaches a tower, it enters the second stage: the camera captures the tower and extracts its vertical centerline, visual positioning in the direction parallel to the power transmission line is obtained, the inertial navigation information is corrected, and the unmanned aerial vehicle is controlled to photograph the tower at the set waypoint; after shooting, the unmanned aerial vehicle continues forward, and the first and second stages repeat until the inspection task of the power transmission line and towers is complete.
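The Kalman filtering applied to the visual positioning output before it is used as an initial value can be sketched as a scalar filter with a constant-state model; the process and measurement noise values here are illustrative assumptions:

```python
def kalman_1d(measurements, q=1e-3, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter (constant-state model) smoothing a noisy
    distance measurement stream. q: process noise variance, r: measurement
    noise variance; both are illustrative, not the patent's tuning."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                # predict (state assumed constant)
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update with measurement z
        p = (1 - k) * p
        estimates.append(x)
    return estimates
```

Fed a distance oscillating around 5.0 m with ±0.2 m noise, the filtered estimate settles much closer to 5.0 than any single raw measurement.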
CN202110495250.7A 2021-05-07 2021-05-07 Unmanned aerial vehicle autonomous line patrol method based on visual positioning Pending CN113222838A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110495250.7A CN113222838A (en) 2021-05-07 2021-05-07 Unmanned aerial vehicle autonomous line patrol method based on visual positioning

Publications (1)

Publication Number Publication Date
CN113222838A 2021-08-06

Family

ID=77091526

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110495250.7A Pending CN113222838A (en) 2021-05-07 2021-05-07 Unmanned aerial vehicle autonomous line patrol method based on visual positioning

Country Status (1)

Country Link
CN (1) CN113222838A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114582040A (en) * 2022-05-05 2022-06-03 中国长江三峡集团有限公司 Intelligent inspection system and method for wind power generation equipment
CN114689030A (en) * 2022-06-01 2022-07-01 中国兵器装备集团自动化研究所有限公司 Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103855644A (en) * 2014-03-14 2014-06-11 刘凯 Multi-rotary-wing intelligent inspection robot for overhead line
CN106934832A (en) * 2017-03-23 2017-07-07 电子科技大学 A kind of simple straight line automatic positioning method towards vision line walking
US20170313439A1 * 2016-04-29 2017-11-02 Jordan Holt Methods and systems for obstruction detection during autonomous unmanned aerial vehicle landings
CN108413964A (en) * 2018-03-08 2018-08-17 云南电网有限责任公司电力科学研究院 A kind of unmanned plane polling transmission line path planning method and system
CN110133440A (en) * 2019-05-27 2019-08-16 国电南瑞科技股份有限公司 Electric power unmanned plane and method for inspecting based on Tower Model matching and vision guided navigation
CN110395398A (en) * 2019-09-05 2019-11-01 广东电网有限责任公司 A kind of ground connection assembly system and its earthing method based on multi-rotor unmanned aerial vehicle
CN110687904A (en) * 2019-12-09 2020-01-14 广东科凯达智能机器人有限公司 Visual navigation routing inspection and obstacle avoidance method for inspection robot
CN111275015A (en) * 2020-02-28 2020-06-12 广东电网有限责任公司 Unmanned aerial vehicle-based power line inspection electric tower detection and identification method and system
CN111404083A (en) * 2020-04-28 2020-07-10 国网湖南省电力有限公司 Power transmission line inspection robot based on comprehensive navigation and line inspection method thereof
JP6852936B1 (en) * 2019-11-15 2021-03-31 広東工業大学Guangdong University Of Technology Drone visual odometer method based on depth dotted line features
CN112613334A (en) * 2020-06-23 2021-04-06 辽宁工程技术大学 High-precision autonomous inspection image identification method of unmanned aerial vehicle on power transmission line

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHENG Jianhong: "A review of machine vision applications in transmission line inspection robots", Techniques of Automation and Applications, no. 04 *
WANG Jidai; GUO Shuai; SUN Aiqin; FU Enpeng; LIANG Maoxuan; YANG Shuai; HOU Jianguo: "Online distance measurement for high-voltage transmission line inspection robots based on binocular vision", Science Technology and Engineering, no. 15 *


Similar Documents

Publication Publication Date Title
CN107729808B (en) Intelligent image acquisition system and method for unmanned aerial vehicle inspection of power transmission line
CN106326892B (en) Visual landing pose estimation method of rotary wing type unmanned aerial vehicle
CN110879601B (en) Unmanned aerial vehicle inspection method for unknown fan structure
Luo et al. A survey of intelligent transmission line inspection based on unmanned aerial vehicle
CN104865971B (en) The control method and unmanned plane of a kind of polling transmission line unmanned plane
CN109683629B (en) Unmanned aerial vehicle electric power overhead line system based on combination navigation and computer vision
CN106873627A (en) A kind of multi-rotor unmanned aerial vehicle and method of automatic detecting transmission line of electricity
CN105446351B (en) It is a kind of can lock onto target Qu Yu lookout the unmanned airship system based on independent navigation
CN112215860A (en) Unmanned aerial vehicle positioning method based on image processing
CN106708073B (en) A kind of quadrotor system of independent navigation power-line patrolling fault detection
CN113222838A (en) Unmanned aerial vehicle autonomous line patrol method based on visual positioning
CN106502257B (en) Anti-interference control method for precise landing of unmanned aerial vehicle
CN110618691B (en) Machine vision-based method for accurately landing concentric circle targets of unmanned aerial vehicle
CN109885086A (en) A kind of unmanned plane vertical landing method based on the guidance of multiple polygonal shape mark
CN106960454A (en) Depth of field barrier-avoiding method, equipment and unmanned vehicle
CN112068539A (en) Unmanned aerial vehicle automatic driving inspection method for blades of wind turbine generator
CN109976339B (en) Vehicle-mounted distribution network inspection data acquisition method and inspection system
CN112947569B (en) Visual servo target tracking control method for quad-rotor unmanned aerial vehicle based on preset performance
CN112040175A (en) Unmanned aerial vehicle inspection method and device, computer equipment and readable storage medium
CN114111799A (en) Unmanned aerial vehicle aerial photography path planning method aiming at high monomer fine modeling
CN110825098B (en) Unmanned aerial vehicle distribution network intelligent inspection system
CN110850889B (en) Unmanned aerial vehicle autonomous inspection system based on RTK navigation
CN113378701B (en) Ground multi-AGV state monitoring method based on unmanned aerial vehicle
CN109145905A (en) A kind of transmission line of electricity accessory detection method of view-based access control model conspicuousness
CN115686073B (en) Unmanned aerial vehicle-based transmission line inspection control method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220224

Address after: No.844, Longshan Road, Lishi District, Luliang City, Shanxi Province 033000

Applicant after: LVLIANG POWER SUPPLY COMPANY, STATE GRID SHANXI ELECTRIC POWER Co.

Address before: No.844, Longshan Road, Lishi District, Luliang City, Shanxi Province 033000

Applicant before: LVLIANG POWER SUPPLY COMPANY, STATE GRID SHANXI ELECTRIC POWER Co.

Applicant before: Beijing Institute of Technology