CN114217641A - Unmanned aerial vehicle power transmission and transformation equipment inspection method and system in non-structural environment - Google Patents


Info

Publication number
CN114217641A
Authority
CN
China
Prior art keywords
inspection
point
candidate
points
image
Prior art date
Legal status
Granted
Application number
CN202111274981.5A
Other languages
Chinese (zh)
Other versions
CN114217641B (en)
Inventor
马磊
王耀东
王勇
孟大鹏
缑培培
付治宇
左魁生
Current Assignee
State Grid Henan Electric Power Co Zhengzhou Power Supply Co
Zhongmu Power Supply Co Of State Grid Henan Electric Power Co
State Grid Corp of China SGCC
Original Assignee
State Grid Henan Electric Power Co Zhengzhou Power Supply Co
Zhongmu Power Supply Co Of State Grid Henan Electric Power Co
State Grid Corp of China SGCC
Priority date
Filing date
Publication date
Application filed by State Grid Henan Electric Power Co Zhengzhou Power Supply Co, Zhongmu Power Supply Co Of State Grid Henan Electric Power Co, State Grid Corp of China SGCC filed Critical State Grid Henan Electric Power Co Zhengzhou Power Supply Co
Priority to CN202111274981.5A priority Critical patent/CN114217641B/en
Publication of CN114217641A publication Critical patent/CN114217641A/en
Application granted granted Critical
Publication of CN114217641B publication Critical patent/CN114217641B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12: Target-seeking control

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a method and system for unmanned aerial vehicle inspection of power transmission and transformation equipment in a non-structural environment. S1, environmental information is acquired through a sensor; S2, features are extracted from the environment-information image and the power transmission and transformation equipment to be detected is marked as candidate inspection points; if candidate inspection points exist, go to S3, otherwise go to S4; S3, the optimal inspection point is selected from all candidate inspection points, the robot drives to it, marks it as inspected, and returns to S1; S4, the inspection robot queries again whether a candidate inspection point exists; if so, go to S3, otherwise go to S5; S5, check whether all candidate inspection points are marked as inspected; if uninspected candidate points remain, drive the inspection robot to such a point and return to S1, otherwise the inspection is finished. The method optimally plans the inspection path while also exploring the unknown inspection environment during inspection, thereby realizing autonomous inspection in an unknown non-structural environment.

Description

Unmanned aerial vehicle power transmission and transformation equipment inspection method and system in non-structural environment
The technical field is as follows:
the invention relates to the field of power transmission and transformation equipment inspection, in particular to an unmanned aerial vehicle power transmission and transformation equipment inspection method and system in a non-structural environment.
Background art:
To ensure stable power supply, maintenance personnel must regularly inspect power supply and transformation lines to find defects in the equipment. With the development of unmanned robots, inspection has shifted from manual inspection to machine inspection, which greatly improves inspection efficiency, reduces labor intensity, and can even directly locate and report a fault position, greatly improving the efficiency with which maintainers handle faults and ensuring stable power supply. Machine inspection includes aerial unmanned-aerial-vehicle inspection and ground-robot inspection. In either mode, however, inspection is carried out by real-time manual remote control, or inspection coordinates are loaded into the robot in advance and the robot visits them one by one; this is equivalent to giving the robot the environment to be inspected in advance, so the robot does not inspect autonomously within the environment. Moreover, since the inspection robot carries a limited power supply, neither manual remote control nor coordinate-point inspection guarantees that the most equipment is inspected for the same energy cost, so the robot must frequently return to replace batteries, wasting time and electric energy and hindering robotic inspection.
The invention content is as follows:
The technical problem to be solved by the invention is as follows: the unknown structural environment is perceived through sensors, and the optimal inspection point is selected among the power supply and transformation equipment to be inspected; the inspection path is optimally planned while the unknown inspection environment is also explored during inspection, so that autonomous inspection in an unknown non-structural environment is achieved, solving the excessive power consumption and waste of human resources caused by manual remote-control inspection and imported-coordinate inspection.
In order to solve the above technical problems, the invention provides the following technical scheme: a method for unmanned aerial vehicle inspection of power transmission and transformation equipment in a non-structural environment, comprising the following steps. Step one, the unmanned aerial vehicle collects environmental information of the power transmission and transformation equipment through an onboard sensor;
step two, processing the obtained environment information image, extracting image features in the processed image, marking the power transmission and transformation equipment needing to be detected as a candidate inspection point, if the candidate inspection point exists in the exploration environment, skipping to step three, and if the candidate inspection point does not exist, skipping to step four;
selecting an optimal inspection point from all candidate inspection points, driving to the inspection point, marking the inspection point as inspected, and returning to the step one;
step four, the inspection robot inquires whether candidate inspection points exist again, if so, the inspection robot returns to the step three, and if not, the inspection robot jumps to the step five;
step five, check whether all candidate inspection points are marked as inspected; if uninspected candidate inspection points exist, drive the inspection robot to such a point and jump to step one; if none exist, the inspection ends.
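The five-step procedure above can be sketched as a simple control loop. The `InspectionPoint` record, the `sense` callback, and the scalar `score` field below are illustrative assumptions for the sketch, not structures defined by the patent:

```python
from dataclasses import dataclass

@dataclass
class InspectionPoint:
    """Hypothetical candidate-point record (name and MCDM score assumed)."""
    name: str
    score: float
    inspected: bool = False

def inspection_loop(sense, max_rounds=100):
    """Steps one to five as a loop: perceive (sense), pick the best
    unvisited candidate, mark it inspected, repeat until none remain."""
    known, order = [], []
    for _ in range(max_rounds):
        known.extend(sense())                        # steps one/two: perceive and mark
        pending = [p for p in known if not p.inspected]
        if not pending:                              # steps four/five: nothing left
            break
        best = max(pending, key=lambda p: p.score)   # step three: optimal point
        best.inspected = True                        # drive there and mark inspected
        order.append(best.name)
    return order
```

Each call to `sense()` stands in for steps one and two (perception plus candidate marking); the loop terminates once re-querying yields no uninspected candidates.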
Further, in the first step, the sensors are a three-dimensional camera SR-3000, a laser range finder and a gyroscope.
Further, in the second step, the image processing and feature extraction steps are as follows: 1) preprocessing the image to obtain image gray scale and three-dimensional information;
2) marking the ground, the sky and the distant view in the gray level image according to the threshold value, and deleting the sky, the distant view and the ground area;
3) carrying out binarization on the images for marking the sky and the ground, wherein the gray value of the region of interest is represented by a non-zero value, and the gray value of the region of non-interest is represented by zero;
4) clustering the extracted interesting regions in the gray level image by utilizing the three-dimensional information of the pixels, and separating the interesting regions from the non-interesting regions;
5) comparing two adjacent points of the current interest area in the image, if the distance between the two points is within a certain range, considering that the two data points belong to the same class, if the distance exceeds a threshold value, considering that the two points belong to different classes, and starting the next round of data comparison by taking the current data point as the starting point of the new class to finish secondary cluster analysis;
6) and determining and extracting edges through edge detection, and drawing the target object.
Further, the image preprocessing steps are as follows: noise generated during signal acquisition is eliminated by mean filtering or median filtering, and the gray scale of the image is transformed by linear, nonlinear or piecewise-linear gray-scale transformation.
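As a hedged sketch of this preprocessing, the following pure-Python functions apply a 3 × 3 median filter and a linear gray-scale transform g = a·f + b. The window size, coefficients, and clipping to [0, 255] are illustrative assumptions, not values given in the patent:

```python
def median_filter3(img):
    """3x3 median filter on a 2-D gray image; border pixels are left unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]          # median of the 9 window values
    return out

def linear_gray_transform(img, a=1.0, b=0.0):
    """Linear gray-scale transform g = a*f + b, clipped to [0, 255]."""
    return [[min(255, max(0, int(a * v + b))) for v in row] for row in img]
```

Median filtering suppresses isolated noise spikes (as produced during signal acquisition) better than mean filtering, at the cost of slightly more computation.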
Further, the image marking steps are as follows: 1) mark the area belonging to the sky on the image according to the height information of the image pixels, setting to zero the gray value of any pixel whose height value is larger than a set threshold;
2) mark the ground area on the image according to the height information of the image pixels, setting to zero the gray value of any pixel whose height value is smaller than a set threshold;
3) mark the distant-view area on the image according to the distance information of the image pixels, setting to zero the gray value of any pixel whose distance value is larger than a set threshold.
Furthermore, during ground marking, partial discontinuous points may remain in the region of no interest; the binary image is processed by erosion and dilation operations to remove these discontinuities.
Further, in step three, the method for determining the optimal candidate inspection point is as follows: an MCDM (multi-criteria decision making) system is adopted; multiple evaluation indices of each candidate inspection point are considered comprehensively according to the evaluation conditions to obtain an evaluation value, the evaluation values of the candidate inspection points are compared, and the candidate point with the largest evaluation value is selected as the optimal inspection point, completing the selection of the optimal point from the candidates.
Further, the evaluation conditions are as follows: path consumption, information gain and rotation angle, wherein
path consumption: the distance the inspection robot travels from the current inspection point to the target inspection point during one inspection;
information gain: the new environment information obtained at an inspection point, comprising the newly acquired environment area and the length of the free boundary on which the inspection point lies;
rotation angle: the angle through which the robot at its current position must rotate to face the selected inspection point.
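An MCDM-style evaluation over these three conditions might be sketched as a weighted sum. The weight values, dictionary keys, and the assumption that every score is pre-normalized to [0, 1] (with cost-type conditions such as path consumption and rotation angle inverted so that larger is always better) are illustrative, not specified by the patent:

```python
def evaluate(candidates, weights):
    """Weighted-sum MCDM over per-condition scores.

    Each candidate is a dict of scores assumed pre-normalized to [0, 1];
    cost-type conditions are assumed already inverted so 1 is best.
    """
    def value(c):
        # Overall evaluation value: weighted sum of the condition scores.
        return sum(weights[k] * c[k] for k in weights)
    # The candidate with the largest evaluation value is the optimal point.
    return max(candidates, key=value)
```

For example, with weights such as `{"path": 0.4, "gain": 0.4, "rot": 0.2}`, a candidate with a high information-gain score can outrank one that is merely closer, which is exactly the trade-off the three evaluation conditions are meant to balance.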
In order to solve the above technical problems, another technical solution provided by the invention is: an unmanned aerial vehicle power transmission and transformation equipment inspection system in a non-structural environment, characterized by comprising a perception module, an autonomous module and a movement module, wherein:
perception module: a sensor assembly used to collect environmental information of the power transmission and transformation equipment;
autonomous module: processes the obtained environment-information image, extracts image features from the processed image, marks the power transmission and transformation equipment to be detected as candidate inspection points, selects the optimal inspection point from all candidate inspection points according to the judgment conditions, and sends the movement module a control instruction to move to the optimal inspection point;
movement module: receives the control instruction sent by the autonomous module and adjusts the corresponding power so as to move to the optimal inspection point.
The invention has the beneficial effects that:
The environmental information of the power transmission and transformation equipment is collected through the sensor carried by the unmanned aerial vehicle. The obtained environment-information image is processed and image features are extracted; the power transmission and transformation equipment to be detected is marked as candidate inspection points; if candidate inspection points exist in the explored environment, jump to step three, otherwise jump to step four. The optimal inspection point is selected from all candidate inspection points, the robot drives to it, marks it as inspected, and returns to step one. The inspection robot queries again whether candidate inspection points exist; if so, it returns to step three, otherwise it jumps to step five. Whether all candidate inspection points are marked as inspected is then checked; if uninspected candidate points exist, the robot drives to such a point and jumps to step one; if none exist, the inspection ends. The inspection path is thus optimally planned while the unknown inspection environment is explored during inspection, realizing autonomous inspection in an unknown non-structural environment and solving the excessive power consumption and waste of human resources caused by manual remote-control inspection and imported-point inspection.
Because the background color in a non-structural environment is similar to that of the power supply and transformation equipment, the three-dimensional camera is used to obtain gray images and depth information. Based on the gray information and the three-dimensional information, a three-dimensional-information threshold method divides the image into regions of interest and regions of no interest, improving image quality and removing redundant information of little value. The power supply and transformation equipment within the region of interest then undergoes a secondary separation process that segments the image, ensuring that the devices in the region of interest are mutually independent, which facilitates subsequent inspection-point marking and smooth inspection.
The inspection robot performs autonomous inspection in an unknown environment. During inspection, perception of the unknown environment relies mainly on the sensors carried by the robot; since the sensing range is limited, the robot can only perceive the environment around its inspection path, so the process of inspecting the power supply and transformation equipment is also the process of perceiving the unknown environment.
Description of the drawings:
In order to more clearly illustrate the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a flow chart of a method for inspecting power transmission and transformation equipment of an unmanned aerial vehicle in a non-structural environment;
FIG. 2 is a flowchart of the optimal patrol point acquisition step;
FIG. 3 is a first schematic diagram of a simulation experiment exploration process;
FIG. 4 is a schematic diagram of a simulation experiment exploration process II;
FIG. 5 is a third schematic diagram of a simulation experiment exploration process;
fig. 6 is a system connection block diagram of the unmanned aerial vehicle power transmission and transformation equipment inspection system in the non-structural environment.
The specific implementation mode is as follows:
in order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Referring to the attached drawings 1-5, the application discloses an unmanned aerial vehicle power transmission and transformation equipment inspection method in an unstructured environment, which comprises the following steps: s1, the unmanned aerial vehicle collects environmental information of the power transmission and transformation equipment through a carried sensor;
step S2, processing the obtained environment information image, extracting image features in the processed image, marking the power transmission and transformation equipment needing to be detected as a candidate patrol point, if the candidate patrol point exists in the exploration environment, jumping to step S3, and if the candidate patrol point does not exist, jumping to step S4;
s3, selecting an optimal inspection point from all the candidate inspection points, driving to the inspection point, marking the inspection point as inspected, and returning to S1;
step S4, the inspection robot inquires whether a candidate inspection point exists again, if so, the inspection robot returns to S3, and if not, the inspection robot jumps to S5;
S5, check whether all candidate inspection points are marked as inspected; if uninspected candidate points exist, drive the inspection robot to such a point and jump to S1; if none exist, end the inspection.
The present application perceives the unknown structural environment through its own sensors and selects the optimal inspection point among the acquired power supply and transformation equipment to be inspected. While optimally planning the inspection path, it can also explore the unknown inspection environment during inspection, thereby achieving autonomous inspection in an unknown non-structural environment and solving the excessive power consumption and waste of human resources caused by manual remote-control inspection and imported-point inspection.
If several candidate exploration points exist during exploration, the selected optimal exploration point is marked as explored, while the unselected candidate points are marked as unexplored and wait to be selected in a later round. If there is only one candidate point, that point is the optimal exploration point and is marked as explored.
If the inspection robot finds no candidate point after a period of exploration, then: if its current point is the only candidate point produced by the previous exploration, the robot should rotate in place to search for candidate points around it; if its current point is one of several candidate points produced by the previous search, the robot can drive directly to the suboptimal point of that search without rotating.
Motion to a point proceeds in three stages. First, the robot rotates, because the direction of the target point may not coincide with its initial heading, so it must turn to face the target. Second, it travels straight and stops when it reaches the target point. Finally, it rotates again after reaching the candidate point so as to keep the best posture for exploration. Note that if the current heading already coincides with the desired direction, the rotation step can be skipped.
Preferably, in the step S1, the sensors are a three-dimensional camera SR-3000, a laser range finder, and a gyroscope.
The three-dimensional camera SR-3000 was developed by MESA Imaging AG of Switzerland in 2006. An array of 55 built-in LEDs in the SR-3000 provides an infrared light source; based on the TOF (time-of-flight) ranging principle, the gray information and depth information of corresponding scene points can be obtained simultaneously. The image resolution is 176 × 144 pixels and the maximum effective distance reaches 7.5 m. The processing speed reaches 30 frames per second, the power consumption is only 1 W, and the field of view is 47.5° × 39.6°. For an imaging area of 7.5 m (width) × 5 m (height) at full depth of field, the imaging precision reaches 12.8 cm²; this moderate precision makes it suitable for close-range obstacle detection by the inspection robot.
The laser range finder is an external sensor with high precision and high resolution. It first emits a laser beam toward the target and then receives the beam reflected by the target with a photoelectric element; a timer measures the time from emission to reception and computes the distance from the observer to the target. Compared with sonar, it offers denser data points, faster scanning, a shorter sampling period, a longer detection distance and higher angular resolution; compared with a vision sensor, it is unaffected by ambient light and its data processing is simple with a small computational load. For these reasons it has become the main ranging sensor on many current inspection robots.
Because the working environment changes in real time, the inspection robot can pitch and roll relative to the horizontal plane; therefore, besides acquiring external information about the surroundings, the mobile inspection robot also needs to know its own attitude, and the gyroscope is used to measure the robot's attitude parameters.
Further, the power transmission and transformation equipment refers to power equipment used for power supply and transformation, such as transformers, circuit breakers, insulators, power transmission lines and tower poles.
Preferably, in step S2, the scene image acquired by the three-dimensional camera in the non-structural environment is divided into distant-view (beyond the robot's line-of-sight detection range), sky (above the height the robot can reach), ground and obstacle areas. The sky, distant-view and ground areas cannot influence the robot's current behavior or walking route and are called regions of no interest; the position, type, shape and size of objects in the remaining area are closely related to the robot's next action and require close analysis and attention, i.e. the region of interest.
Therefore, the optimal patrol point acquisition mode is as follows:
and step S31, marking the ground, the sky and the distant view in the gray level image according to the threshold value, and deleting the sky, the distant view and the ground area. The calculation amount in the subsequent algorithm processing process is reduced.
(1) Mark the area belonging to the sky on the image according to the height information of the image pixels, setting to zero the gray value of any pixel whose height value is larger than a set threshold;
(2) mark the ground area on the image according to the height information of the image pixels, setting to zero the gray value of any pixel whose height value is smaller than a set threshold;
(3) mark the distant-view area on the image according to the distance information of the image pixels, setting to zero the gray value of any pixel whose distance value is larger than a set threshold.
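Steps (1)-(3) can be sketched as a single masking pass over the image. The per-pixel `height` and `dist` maps and the specific threshold values are illustrative assumptions; the patent does not fix them:

```python
def mask_regions(gray, height, dist, sky_h=2.5, ground_h=0.1, far_d=7.5):
    """Zero the gray value of sky pixels (height above sky_h), ground
    pixels (height below ground_h) and distant-view pixels (distance
    beyond far_d). All thresholds here are assumed example values."""
    h, w = len(gray), len(gray[0])
    out = [row[:] for row in gray]
    for y in range(h):
        for x in range(w):
            if (height[y][x] > sky_h or height[y][x] < ground_h
                    or dist[y][x] > far_d):
                out[y][x] = 0   # region of no interest
    return out
```

After this pass, only the candidate region of interest retains non-zero gray values, matching the binarization convention of step S32.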
And step S32, binarizing the image for marking the sky and the ground, namely, representing the gray value of the interested region by using a non-zero value and representing the gray value of the non-interested region by using zero.
In step S33, during ground marking, unevenness of the ground may leave partial discontinuous points in the region of no interest; these points should be attributed to the ground but appear as region of interest. The binary image is processed by erosion and dilation operations in sequence to remove these discontinuities.
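Erosion followed by dilation is the standard morphological opening. A minimal pure-Python sketch on a binary image, assuming a 3 × 3 structuring element (the element size is an illustrative choice):

```python
def erode(img):
    """Binary erosion with a 3x3 structuring element (border set to 0)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = int(all(img[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)))
    return out

def dilate(img):
    """Binary dilation with a 3x3 structuring element."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = int(any(img[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                                if 0 <= y + dy < h and 0 <= x + dx < w))
    return out

def open_binary(img):
    """Opening (erosion then dilation): removes isolated discontinuous
    points while restoring the shape of larger connected regions."""
    return dilate(erode(img))
```

An isolated one-pixel discontinuity is erased entirely, while a solid region larger than the structuring element survives the opening with its shape intact.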
In step S34, clustering separates the region of interest from the region of no interest, but does not yet distinguish the individual power supply and transformation devices inside the region of interest. The arrangement of the three-dimensional data points reflects the geometric position of each spatial point, and pixels belonging to one device appear as data points close to each other, so the three-dimensional pixel information can be used to cluster the extracted regions of interest in the gray image. The premise is that the power supply and transformation devices, such as insulators and power transmission lines, are independent objects. Two adjacent points of the current region of interest are compared: if their distance is within a certain range, the two data points are considered to belong to the same class; if it exceeds the threshold, they belong to different classes, and the current data point becomes the starting point of a new class for the next round of comparison.
The distance between any two points in the image is calculated using the Euclidean distance; edge detection uses a Canny or Sobel operator.
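The distance-threshold clustering of step S34, using the Euclidean distance between successive 3-D points, can be sketched as follows; the assumption that points are visited in scan order and the threshold value itself are illustrative:

```python
import math

def cluster_by_distance(points, threshold):
    """Sequential clustering: walk the 3-D points in order; a point
    within `threshold` of the previous one joins its class, otherwise
    it starts a new class (the secondary clustering of step S34)."""
    clusters = [[points[0]]]
    for prev, cur in zip(points, points[1:]):
        if math.dist(prev, cur) <= threshold:
            clusters[-1].append(cur)   # same device: points are close
        else:
            clusters.append([cur])     # gap exceeds threshold: new device
    return clusters
```

Each resulting cluster corresponds to one independent device in the region of interest, ready for edge detection and inspection-point marking.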
Further, the image preprocessing comprises the following steps: and eliminating noise generated in signal acquisition by adopting mean filtering or median filtering, and performing gray scale transformation on the image by adopting linear gray scale transformation, nonlinear gray scale transformation or piecewise linear gray scale transformation.
Preferably, in the step S3, an MCDM (Multi-criterion Decision Making) Multi-index Decision Making system is adopted, multiple evaluation indexes of each candidate inspection point are considered comprehensively according to evaluation conditions to obtain evaluation values, the evaluation values of the multiple candidate inspection points are compared, the candidate inspection point with the largest evaluation value is selected as an optimal inspection point, and the selection of the optimal inspection point from the multiple candidate inspection points is completed.
The leading-edge (frontier) theory originated from the greedy strategy proposed by Thrun, in which the inspection robot selects the unexplored area nearest to itself as the next target point with a certain probability q, and selects other areas as the exploration area with probability 1 - q. Yamauchi then proposed the leading-edge theory proper: the leading edge is the boundary between the already-explored, detected area and the unexplored area, so a larger information gain can be obtained in the leading-edge region. The theory acquires environment information quickly and effectively; exploration based on it is active and purposeful, effectively exploring the unknown environment and converting it into a known one.
Here, the detected area refers to the area that the inspection robot's sensor has been able to scan; the undetected area is the area the sensor has not yet detected, which may lie outside the sensor's range or behind an obstacle.
Although the leading-edge theory can obtain a large information gain, it ignores other performance indices: the traveling distance is relatively long, the path consumption is relatively large, and excessive pursuit of maximum information gain prevents the exploration from conveniently covering the environment. If leading-edge points were taken directly as candidate inspection points, a larger information gain could be obtained, but environment information would easily be lost, the requirement of traversing the whole working environment could not be met, and the resulting map would be incomplete; moreover, since leading-edge points generally lie at the edge of the detected area, the travel cost of the inspection robot would be large. In summary, the leading-edge point is not the optimal candidate inspection point and the leading-edge theory is not a complete theory; it merely provides an active detection idea that drives the inspection robot toward the unknown environment.
Since the explored area may contain one or more candidate inspection points, candidates at different positions have different attributes. The main attribute parameters are the distance from the candidate point to the robot's current position, the maximum information gain obtainable at the point, and the rotation angle required for the robot to face the point.
Different criteria can be used to evaluate a candidate inspection point. The simplest is path cost alone: the candidate with the smallest path cost is selected as the optimal candidate inspection point. Other criteria combine path cost with additional indices, such as information gain; the optimal candidate selected by such criteria balances the cost of reaching the point against the information gained there.
Path cost: the distance the inspection robot travels from its current position to the target candidate inspection point p during one search step.
Information gain: the amount of new environment information obtainable at candidate inspection point p. It can be expressed in two ways: as the area of newly observed environment, or as the length of the free boundary on which p lies.
Rotation angle: the angle the robot must turn from its current heading to face the selected candidate inspection point.
Suppose the evaluation value of candidate inspection point p under the i-th evaluation condition is u_i(p), with u_i(p) between 0 and 1; this value measures the quality of p with respect to condition i. By convention, the larger the evaluation value of a condition at a candidate point, the better that point satisfies the condition; conversely, a smaller value means a worse fit. For the information-gain condition, a larger value means the candidate point can acquire more unknown environment information. Such an evaluation value is generally computed by the following formula.
u_c(p) = c(p) / max_{p'∈L} c(p')    (1)
In formula (1), u_c(p) is the information-gain evaluation value of candidate inspection point p, L is the set of all candidate inspection points, and c(p) is the raw information-gain value of p. The formula normalizes the information gain of every candidate point into [0, 1], which simplifies the later computation.
For some conditions, however, a larger raw value is worse: for path cost, the larger the distance consumed, the less desirable the point. If formula (1) were still used, the result would be inverted, with larger evaluation values marking worse points, which contradicts the convention above and complicates the later combined computation. The rotation-angle condition behaves the same way, so such conditions are generally computed by formula (2).
u_c(p) = 1 - c(p) / max_{p'∈L} c(p')    (2)
In formula (2), u_c(p) is the path-cost evaluation value of candidate inspection point p, L is the set of all candidate inspection points, and c(p) is the raw path-cost value of p. The evaluation value is again normalized into [0, 1], with smaller costs yielding larger values.
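Formulas (1) and (2) are rendered as images in the original document; assuming the standard max-normalization described by the surrounding text, the two evaluation rules can be sketched as:

```python
# Sketch of the two evaluation-value formulas (reconstructed under the stated assumption).
# Benefit criteria (information gain): u(p) = c(p) / max c      -> larger raw value is better.
# Cost criteria (path cost, rotation): u(p) = 1 - c(p) / max c  -> larger raw value is worse.

def normalize_benefit(raw):
    """Formula (1): scale raw benefit values into [0, 1]."""
    m = max(raw.values())
    return {p: c / m for p, c in raw.items()}

def normalize_cost(raw):
    """Formula (2): scale raw cost values into [0, 1], inverted so smaller cost scores higher."""
    m = max(raw.values())
    return {p: 1 - c / m for p, c in raw.items()}

gains = {"p1": 9.0, "p2": 5.0, "p3": 6.0}   # illustrative raw information-gain values
print(normalize_benefit(gains))             # p1 maps to 1.0, the best score
```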
To obtain the optimal candidate inspection point, the inspection robot must compute and compare the evaluation values of all candidate inspection points and select the one with the largest value. The evaluation function must take into account every candidate point and every judgment condition of each candidate; only then is the selected candidate optimal.
Suppose N is the set of n evaluation conditions, with j denoting the j-th condition in N (for example, path cost). Let L be the set of l candidate inspection points, with p denoting the p-th candidate in L. Then u_j(p) is the evaluation value of candidate point p under the j-th condition.
Three evaluation conditions are applied here: path cost, information gain and rotation angle, i.e. N = {path cost, information gain, rotation angle}. The simplest approach is to use a weighted-sum evaluation function; if the inspection robot should favour information gain, a larger value is chosen as the coefficient of information gain, as shown in Table 1.
TABLE 1 Weighting coefficients of the evaluation indices

Evaluation condition     Path cost    Information gain    Rotation angle
Weighting coefficient    0.3          0.4                 0.3
Suppose there are three candidate inspection points, L = {p1, p2, p3}, whose evaluation values are shown in Table 2.
TABLE 2 Weighted evaluation values

Candidate point    Path cost    Information gain    Rotation angle    Weighted sum
p1                 0.2          0.9                 0.7               0.63
p2                 0.6          0.5                 0.1               0.41
p3                 0.9          0.6                 0.3               0.60
The weighted sum of candidate point p1 is the largest, so p1 is selected as the optimal candidate inspection point. However, although p1 offers a large information gain, the path cost of reaching it is also large; p1 becomes the optimal candidate partly because the chosen weighting coefficients favour information gain. The problem is that the information gain is bought at the expense of distance: the inspection robot travels farther. Conversely, a weighting that favours small path cost tends to select candidates with little information gain. This is the common weakness of the weighted-average method: the evaluation conditions constrain one another, and a fixed linear weighting cannot account for these interactions.
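The weighted-sum selection of Tables 1 and 2 can be reproduced in a few lines (the criterion names are illustrative):

```python
# Weighted-sum evaluation reproducing Tables 1 and 2.
weights = {"path_loss": 0.3, "info_gain": 0.4, "rotation": 0.3}   # Table 1

candidates = {                                                    # Table 2 evaluation values
    "p1": {"path_loss": 0.2, "info_gain": 0.9, "rotation": 0.7},
    "p2": {"path_loss": 0.6, "info_gain": 0.5, "rotation": 0.1},
    "p3": {"path_loss": 0.9, "info_gain": 0.6, "rotation": 0.3},
}

def weighted_sum(values, weights):
    """Linear combination of the evaluation values with the Table 1 coefficients."""
    return sum(weights[k] * values[k] for k in weights)

scores = {p: round(weighted_sum(v, weights), 2) for p, v in candidates.items()}
best = max(scores, key=scores.get)
print(scores, best)   # p1 wins with 0.63, matching the table's weighted-sum column
```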
MCDM provides a solution to this problem; the method and related concepts are introduced below. First, define an integration function μ: P(N) → [0, 1] on the power set P(N) of the condition set N, satisfying the following conditions.
(1) μ(∅) = 0, μ(N) = 1;
(2) if A ⊆ B ⊆ N, then μ(A) ≤ μ(B) (monotonicity).
Let A ∈ P(N); μ(A) is the weight coefficient of the subset A of evaluation conditions. The weighting thus applies not only to a single condition but jointly to groups of conditions. The evaluation function u(p) of candidate inspection point p then has the following expression.
u(p) = Σ_{j=1}^{n} [u_j(p) - u_{j-1}(p)] · μ(A_j)
Here the evaluation values of candidate point p under the n conditions are arranged in ascending order, u_1(p) ≤ … ≤ u_n(p) ≤ 1, with u_0(p) = 0 and A_j = {i ∈ N | u_i(p) ≥ u_j(p)}. Different integration coefficients μ yield different evaluations. For two conditions c1 and c2 with integration-function values μ({c1}) and μ({c2}):
(1) if μ({c1, c2}) < μ({c1}) + μ({c2}), the two criteria are redundant;
(2) if μ({c1, c2}) > μ({c1}) + μ({c2}), the two criteria are synergistic;
The same principle applies to more than two criteria. The weighted-average method is the special case μ({c1, c2}) = μ({c1}) + μ({c2}), in which the integration function reduces to a weighted average.
The earlier example is now reworked with this method.
TABLE 3 Single-index integration coefficients

Evaluation condition       Path cost    Information gain    Rotation angle
Integration coefficient    0.3          0.4                 0.3
Assume the path cost and the rotation angle are redundant, while path cost and information gain, and information gain and rotation angle, are each synergistic; the specific integration-function values are shown in Table 4.
TABLE 4 Multi-index integration coefficients

Condition pair             Path cost & rotation angle    Path cost & information gain    Information gain & rotation angle
Integration coefficient    0.5                           0.8                             0.8
From the defining formula of the integration function, the following values are obtained.
TABLE 5 Integrated function values
[The integrated-function values of Table 5 are rendered as images in the original document and are not reproduced here.]
Through the above series of calculations, P1 is again obtained as the optimal exploration point. Reviewing P1's evaluation values in Table 2: although its path-cost score is the lowest of the three candidates, its information gain is the largest, and P1 remains the optimal exploration point among the three candidate inspection points.
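The MCDM evaluation above is a Choquet integral with respect to the integration function μ. A sketch using the coefficients of Tables 3 and 4 (μ of the full set is taken as 1 per the definition, and the short criterion names are illustrative):

```python
# Choquet-integral sketch of the MCDM evaluation.
# Singleton weights from Table 3, pair weights from Table 4; mu(full set) = 1 (assumed
# from the definition mu(N) = 1). Criterion names shortened for readability.
mu = {
    frozenset(): 0.0,
    frozenset({"path"}): 0.3, frozenset({"info"}): 0.4, frozenset({"rot"}): 0.3,
    frozenset({"path", "rot"}): 0.5,
    frozenset({"path", "info"}): 0.8,
    frozenset({"info", "rot"}): 0.8,
    frozenset({"path", "info", "rot"}): 1.0,
}

def choquet(values, mu):
    """u(p) = sum over ascending-sorted criteria of (u_j - u_{j-1}) * mu(A_j),
    where A_j is the set of criteria whose value is >= u_j."""
    items = sorted(values.items(), key=lambda kv: kv[1])
    total, prev = 0.0, 0.0
    for j, (_crit, u) in enumerate(items):
        a_j = frozenset(c for c, _ in items[j:])   # criteria with value >= u
        total += (u - prev) * mu[a_j]
        prev = u
    return total

candidates = {
    "p1": {"path": 0.2, "info": 0.9, "rot": 0.7},   # Table 2 evaluation values
    "p2": {"path": 0.6, "info": 0.5, "rot": 0.1},
    "p3": {"path": 0.9, "info": 0.6, "rot": 0.3},
}
scores = {p: round(choquet(v, mu), 2) for p, v in candidates.items()}
print(scores, max(scores, key=scores.get))   # P1 comes out on top, as in the text
```

Note how the synergy between path cost and information gain (μ = 0.8 > 0.3 + 0.4) rewards candidates that score well on both criteria at once, which a fixed linear weighting cannot do.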
The experiments below were performed in a MATLAB simulation environment. The working environment is a 10 m × 10 m unknown area in which several irregular obstacles are placed at random; the environment is assumed unknown before exploration, and the obstacles are shown as yellow regions in the figures. The inspection robot used in the experiment is equipped with a laser rangefinder, a three-dimensional camera and a gyroscope; the laser sensor's detection range is 2 m, its scanning angle is 180° and its angular resolution is 0.5°. The robot's minimum step length is assumed to be 0.2 m, and the laser rangefinder samples one sensor value every 5° to reduce the amount of computation. The exploration process is shown in FIGS. 3-5.
In the simulation figures, the robot's walking route is drawn as a chain of small circles; the sector is the scanning area of the sensor component, the black dots on the sector edge are sensor scan points, and the remaining shapes are objects to be inspected. During inspection, the robot cannot pass through an object to be inspected and must move around it. The simulation shows the robot exploring the working environment; the three figures correspond to three stages of the exploration.
To solve the above technical problems, the invention adopts the following technical scheme: an unmanned aerial vehicle power transmission and transformation equipment inspection system in a non-structural environment which, referring to FIG. 6, comprises a perception module A, an autonomous module B and a moving module C, wherein:
the perception module A uses the sensor component to collect environment information of the power transmission and transformation equipment;
the autonomous module B processes the acquired environment-information image, extracts image features from the processed image, marks the power transmission and transformation equipment to be inspected as candidate inspection points, selects the optimal inspection point from all candidates according to the judgment conditions, and sends the moving module a control instruction to move to that point;
the moving module C receives the control instruction sent by the autonomous module and adjusts the corresponding power so that the UAV moves to the optimal inspection point.
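The perception → autonomous → moving data flow can be sketched as a minimal skeleton (class and method names are illustrative, not from the patent):

```python
# Minimal skeleton of the three-module system; all names and data are illustrative.

class PerceptionModule:
    """Module A: collects environment information (stubbed sensor read)."""
    def collect(self):
        # Illustrative: evaluation values already computed for two candidate points.
        return {"candidates": {"p1": 0.63, "p2": 0.41}}

class AutonomousModule:
    """Module B: selects the optimal candidate inspection point, issues a move command."""
    def decide(self, observation):
        candidates = observation["candidates"]
        best = max(candidates, key=candidates.get)   # largest evaluation value wins
        return {"goto": best}

class MovingModule:
    """Module C: executes the control instruction from the autonomous module."""
    def execute(self, command):
        return f"moving to {command['goto']}"

perception, autonomous, moving = PerceptionModule(), AutonomousModule(), MovingModule()
obs = perception.collect()
cmd = autonomous.decide(obs)
print(moving.execute(cmd))   # the loop repeats until no candidate points remain
```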
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
The foregoing description of the various embodiments is intended to highlight various differences between the embodiments, and the same or similar parts may be referred to each other, and for brevity, will not be described again herein.
The above embodiments are only preferred embodiments of the present invention, and are not intended to limit the present invention in any way, and all simple modifications, equivalent changes and modifications made to the above embodiments according to the technical spirit of the present invention still fall within the scope of the technical solution of the present invention.

Claims (9)

1. An unmanned aerial vehicle power transmission and transformation equipment inspection method in a non-structural environment comprises the following steps: the method comprises the following steps that firstly, an unmanned aerial vehicle collects environmental information of power transmission and transformation equipment through a carried sensor;
step two, processing the obtained environment information image, extracting image features in the processed image, marking the power transmission and transformation equipment needing to be detected as a candidate inspection point, if the candidate inspection point exists in the exploration environment, skipping to step three, and if the candidate inspection point does not exist, skipping to step four;
selecting an optimal inspection point from all candidate inspection points, driving to the inspection point, marking the inspection point as inspected, and returning to the step one;
step four, the inspection robot inquires whether candidate inspection points exist again, if so, the inspection robot returns to the step three, and if not, the inspection robot jumps to the step five;
and step five, checking whether all candidate inspection points are marked as inspected; if an uninspected candidate inspection point exists, the inspection robot drives to that point and the method jumps to step one; if none exists, the inspection ends.
2. The unmanned aerial vehicle power transmission and transformation equipment inspection method under the unstructured environment, which is characterized in that: in the first step, the sensors are a three-dimensional camera SR-3000, a laser range finder and a gyroscope.
3. The unmanned aerial vehicle power transmission and transformation equipment inspection method under the unstructured environment, which is characterized in that: in the second step, the image processing and feature extraction steps are as follows: 1) preprocessing the image to obtain image gray scale and three-dimensional information;
2) marking the ground, the sky and the distant view in the gray level image according to the threshold value, and deleting the sky, the distant view and the ground area;
3) carrying out binarization on the images for marking the sky and the ground, wherein the gray value of the region of interest is represented by a non-zero value, and the gray value of the region of non-interest is represented by zero;
4) clustering the extracted interesting regions in the gray level image by utilizing the three-dimensional information of the pixels, and separating the interesting regions from the non-interesting regions;
5) comparing two adjacent points of the current interest area in the image, if the distance between the two points is within a certain range, considering that the two data points belong to the same class, if the distance exceeds a threshold value, considering that the two points belong to different classes, and starting the next round of data comparison by taking the current data point as the starting point of the new class to finish secondary cluster analysis;
6) and determining and extracting edges through edge detection, and drawing the target object.
4. The unmanned aerial vehicle power transmission and transformation equipment inspection method under the unstructured environment, which is characterized in that: the image preprocessing comprises the following steps: noise generated during signal acquisition is eliminated by mean filtering or median filtering, and the image is subjected to gray-scale transformation by linear, nonlinear or piecewise-linear gray-scale transformation.
5. The unmanned aerial vehicle power transmission and transformation equipment inspection method under the unstructured environment, which is characterized in that: the image marking method comprises the following steps: 1) marking an area belonging to the sky on the image according to the height information of the pixel points of the image, and setting the gray value corresponding to the pixel point with the height value of the pixel larger than a set threshold value to zero;
2) marking the ground area on the image according to the height information of the pixel points of the image, and setting the gray value of the pixel point, which is smaller than the gray value corresponding to the threshold pixel point, to zero;
3) and marking a distant view area on the image according to the distance information of the image pixel points, and setting the gray value of the pixel point with the distance value larger than the corresponding gray value of the threshold pixel point to zero.
6. The unmanned aerial vehicle power transmission and transformation equipment inspection method under the unstructured environment, which is characterized in that: in the process of marking the ground, processing the binary image by corrosion and expansion operation on partial discontinuous points existing in the non-interested region to remove the discontinuous points.
7. The unmanned aerial vehicle power transmission and transformation equipment inspection method under the unstructured environment, which is characterized in that: in the third step, the method for determining the optimal candidate inspection point comprises the following steps: an MCDM (Multi-Criteria Decision Making) system is adopted, the evaluation indices of each candidate inspection point are considered comprehensively according to the evaluation conditions to obtain an evaluation value, the evaluation values of the candidate inspection points are compared, and the candidate with the largest evaluation value is selected as the optimal inspection point.
8. The unmanned aerial vehicle power transmission and transformation equipment inspection method under the unstructured environment, which is characterized in that: the evaluation conditions are as follows: path loss, information gain, and rotation angle, wherein,
and (3) path consumption: in the process of one-time routing inspection, the routing inspection robot travels the distance from the current routing inspection point to the target routing inspection point;
information gain: the method comprises the steps that new environment information is obtained at a patrol point and comprises a newly obtained environment area and the length of a free boundary where the patrol point is located;
rotation angle: the rotation angle is the rotation angle required for the direction of the robot at the current position to reach the selected patrol point.
9. An unmanned aerial vehicle power transmission and transformation equipment inspection system in a non-structural environment, characterized by comprising a perception module, an autonomous module and a moving module, wherein:
a perception module: the sensor component is used for collecting environmental information of the power transmission and transformation equipment;
an autonomous module: processing the obtained environment information image, extracting image features in the processed image, marking power transmission and transformation equipment needing to be detected as candidate inspection points, selecting an optimal inspection point from all the candidate inspection points according to judgment conditions, and sending a control instruction for moving the optimal inspection point to the inspection point to a moving module;
the moving module: receives the control instruction sent by the autonomous module and adjusts the corresponding power so that the UAV moves to the optimal inspection point.
CN202111274981.5A 2021-10-29 2021-10-29 Unmanned aerial vehicle power transmission and transformation equipment inspection method and system in non-structural environment Active CN114217641B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111274981.5A CN114217641B (en) 2021-10-29 2021-10-29 Unmanned aerial vehicle power transmission and transformation equipment inspection method and system in non-structural environment


Publications (2)

Publication Number Publication Date
CN114217641A true CN114217641A (en) 2022-03-22
CN114217641B CN114217641B (en) 2024-05-07

Family

ID=80696376

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111274981.5A Active CN114217641B (en) 2021-10-29 2021-10-29 Unmanned aerial vehicle power transmission and transformation equipment inspection method and system in non-structural environment

Country Status (1)

Country Link
CN (1) CN114217641B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115685975A (en) * 2022-09-14 2023-02-03 国家电网公司西南分部 No-signal off-line operation method and system for power transmission line inspection robot

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109460033A (en) * 2018-12-14 2019-03-12 杭州申昊科技股份有限公司 A kind of intelligent inspection robot
CN110580717A (en) * 2019-08-15 2019-12-17 成都优艾维智能科技有限责任公司 Unmanned aerial vehicle autonomous inspection route generation method for electric power tower
CN110610556A (en) * 2018-06-15 2019-12-24 北京京东尚科信息技术有限公司 Robot inspection management method and system, electronic device and storage medium
CN110879601A (en) * 2019-12-06 2020-03-13 电子科技大学 Unmanned aerial vehicle inspection method for unknown fan structure
CN110908401A (en) * 2019-12-06 2020-03-24 电子科技大学 Unmanned aerial vehicle autonomous inspection method for unknown tower structure



Also Published As

Publication number Publication date
CN114217641B (en) 2024-05-07

Similar Documents

Publication Publication Date Title
CN110988912B (en) Road target and distance detection method, system and device for automatic driving vehicle
WO2020134082A1 (en) Path planning method and apparatus, and mobile device
CN108303096B (en) Vision-assisted laser positioning system and method
US6826293B2 (en) Image processing device, singular spot detection method, and recording medium upon which singular spot detection program is recorded
CN111968128B (en) Unmanned aerial vehicle visual attitude and position resolving method based on image markers
CN109001757B (en) Parking space intelligent detection method based on 2D laser radar
CN114782626B (en) Transformer substation scene map building and positioning optimization method based on laser and vision fusion
CN113298035A (en) Unmanned aerial vehicle electric power tower detection and autonomous cruise method based on image recognition
CN111257892A (en) Obstacle detection method for automatic driving of vehicle
El Yabroudi et al. Adaptive DBSCAN LiDAR point cloud clustering for autonomous driving applications
CN114782729A (en) Real-time target detection method based on laser radar and vision fusion
CN116503803A (en) Obstacle detection method, obstacle detection device, electronic device and storage medium
CN113096181B (en) Method and device for determining equipment pose, storage medium and electronic device
CN112505050A (en) Airport runway foreign matter detection system and method
CN114217641B (en) Unmanned aerial vehicle power transmission and transformation equipment inspection method and system in non-structural environment
CN117589167A (en) Unmanned aerial vehicle routing inspection route planning method based on three-dimensional point cloud model
CN111507341B (en) Method, device and equipment for adjusting target bounding box and storage medium
CN112542800A (en) Method and system for identifying transmission line fault
CN112987720A (en) Multi-scale map construction method and construction device for mobile robot
Valseca et al. Real-time lidar-based semantic classification for powerline inspection
CN115453570A (en) Multi-feature fusion mining area dust filtering method
CN115797397A (en) Method and system for robot to autonomously follow target person in all weather
CN115267827A (en) Laser radar harbor area obstacle sensing method based on height density screening
CN111435086B (en) Navigation method and device based on splicing map
CN113379738A (en) Method and system for detecting and positioning epidemic trees based on images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant