CN111309048B - Method for autonomous along-road flight of a multi-rotor unmanned aerial vehicle combined with road detection - Google Patents



Publication number
CN111309048B
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
road
latitude
longitude
Prior art date
Legal status
Active
Application number
CN202010130574.6A
Other languages
Chinese (zh)
Other versions
CN111309048A (en)
Inventor
杨路
杨嘉耕
段思睿
杨磊
Current Assignee
Chongqing University of Posts and Telecommunications
Original Assignee
Chongqing University of Posts and Telecommunications
Priority date
Filing date
Publication date
Application filed by Chongqing University of Posts and Telecommunications
Priority to CN202010130574.6A
Publication of CN111309048A
Application granted
Publication of CN111309048B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G05D 1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a method for autonomous along-road flight of a multi-rotor unmanned aerial vehicle combined with road detection, and belongs to the field of unmanned aerial vehicle flight control. The method comprises the following steps. S1: acquire video information of the road directly below the unmanned aerial vehicle. S2: transmit the acquired video information to a ground station through a 4G link. S3: perform road detection on the obtained video image at the ground station, and extract the position and angle of the road center line in the whole video image. S4: select a mode using the relation between the unmanned aerial vehicle, the extracted road center line l and the decision box R; combine the GPS information returned by the unmanned aerial vehicle with a road GPS data set and input them into an eight-direction PID strategy controller based on Kalman filtering to obtain a yaw control quantity; send control instructions to the unmanned aerial vehicle through the ground station, first controlling it to fly above the road and then controlling it to fly along the road autonomously. The invention enables accurate autonomous flight along the road and avoids collision with the surrounding environment to a greater extent.

Description

Method for autonomous along-road flight of a multi-rotor unmanned aerial vehicle combined with road detection
Technical Field
The invention belongs to the field of unmanned aerial vehicle flight control, and relates to a Kalman filtering-based multi-rotor unmanned aerial vehicle path accurate positioning fusion control method.
Background
At present, civil unmanned aerial vehicles are controlled by radio-frequency remote controllers: the control distance is short and the signal is easily blocked by obstacles. Today, 4G networks cover the main areas of every large city and can meet the needs of most users; their inherent advantage is that personal communication is very convenient, which provides an excellent basis for removing the communication-distance limitation encountered by civil unmanned aerial vehicle remote-communication technology. Remote control of unmanned aerial vehicles over 4G network communication has therefore become an important direction in the development of future unmanned aerial vehicle remote control.
However, with the continuous improvement of communication, computer and control technologies, 4G unmanned aerial vehicle control still has great room for development, for example delivering express parcels in cities or supervising road traffic in certain areas. All such unmanned aerial vehicles need to fly to target points along a predetermined path. During flight, besides the potential safety problems in the use and operation of unmanned aerial vehicles, keeping precisely to the predetermined path has a very important influence on flight safety, flight efficiency and airspace management, so it is necessary to build a complete unmanned aerial vehicle along-path flight system.
In the prior art, the patent application with publication number CN 105549603 A, "An intelligent road inspection control method for a multi-rotor unmanned aerial vehicle", discloses a multi-rotor unmanned aerial vehicle that obtains imagery of the scene below it through a gimbal, obtains the road center line through image processing, and then performs yaw control by comparing the offset between the road center line and the actual position of the unmanned aerial vehicle. However, it does not describe how the unmanned aerial vehicle autonomously searches for the closest road, and without the GPS of the road and the GPS value of the unmanned aerial vehicle as verification standards and correction conditions, it is difficult to guarantee the flight of the unmanned aerial vehicle when detection fails.
Thus, there is a need for a precisely positioned unmanned aerial vehicle technology capable of autonomously controlling flight along a route.
Disclosure of Invention
In view of the above, the invention aims to provide a Kalman-filtering-based method for accurate positioning and fusion control of a multi-rotor unmanned aerial vehicle along a road. It is applied to the along-road flight of multi-rotor unmanned aerial vehicles and has a variety of applications in real environments, including traffic supervision and express delivery. At the same time, because the unmanned aerial vehicle flies along the road accurately, collision with the surrounding environment can largely be avoided and stable flight of the unmanned aerial vehicle is ensured.
In order to achieve the above purpose, the present invention provides the following technical solutions:
a method for multi-rotor unmanned aerial vehicle to detect autonomous fly along a road in combination with the road, comprising the following steps:
s1: the image sensor on the unmanned aerial vehicle is fixed vertically downwards, and the image sensor of the unmanned aerial vehicle is connected and controlled through an onboard embedded mode so as to acquire video information right below the unmanned aerial vehicle;
s2: transmitting the video information acquired by the image sensor to a ground station through a 4G link;
s3: the method comprises the steps of detecting a road through a ground station, and extracting the position and the angle of a road center line l in the whole video image;
s4: and selecting an unmanned aerial vehicle mode by utilizing the relation between the extracted unmanned aerial vehicle and the road center line l and a decision box R, combining GPS information returned by the unmanned aerial vehicle and a road GPS data set of a region where the GPS information is located, inputting the GPS information into an eight-direction PID strategy controller based on Kalman filtering to obtain a yaw control quantity, sending a control instruction to the unmanned aerial vehicle through a ground station, controlling the unmanned aerial vehicle to fly above a road at first, and controlling the unmanned aerial vehicle to fly along the road autonomously.
Further, in step S3, the road detection performed by the ground station on the obtained video image comprises processing the road-surface image information transmitted back by the unmanned aerial vehicle; the specific process comprises the following steps:
s31: firstly, calibrating a video obtained from a camera, and clearing lens distortion;
s32: carrying out graying treatment on the image;
s33: extracting information in the information by using a Canny edge detection algorithm;
s34: locating the pavement line by using Hough transformation to find out a straight line;
s35: and finding out a central line l of the road through K-means clustering, and recording an included angle theta.
Further, in step S4, according to the processing result of the video image, the specific steps of obtaining the yaw control quantity in combination with the eight-direction PID strategy controller based on Kalman filtering are as follows:
S41: the video decision box R decides the mode and the input quantity of the eight-direction PID strategy controller;
S42: the ground station judges the mode of the unmanned aerial vehicle according to whether the road center line l is inside the decision box R;
(1) If the road center line l is in the decision box R, it is determined that the unmanned aerial vehicle is above the road; the direction angle and the distance between the unmanned aerial vehicle and the target point are calculated from the longitude and latitude C_x, C_y of the current actual position of the unmanned aerial vehicle, the longitude and latitude P_x, P_y of the position of the road, and the longitude and latitude D_x, D_y of the target point of the next stage of the along-road flight;
(2) If the road center line l is outside the decision box R, the unmanned aerial vehicle is offset from the road; the direction angle and the flight distance the unmanned aerial vehicle is required to fly are calculated from the extracted road center line l, the longitude and latitude C_x, C_y of the current actual position of the unmanned aerial vehicle, and the longitude and latitude D_x, D_y of the target point of the next stage of the along-road flight;
The direction angle and the distance between the unmanned aerial vehicle and the target point calculated in mode (1) or (2) are quantized by a constant speed V and eight fixed control directions O; then the GPS value CP_x, CP_y of the unmanned aerial vehicle position and the control time t are used as the input of the controller to control the unmanned aerial vehicle; the actual position C'_x, C'_y fed back by the unmanned aerial vehicle and the theoretical position C_sx, C_sy are input simultaneously into a Kalman filter to obtain the fused position C_ax, C_ay of the unmanned aerial vehicle, which serves as the condition for the next step, and the process advances to step S44;
S43: if the unmanned aerial vehicle cannot detect the road information below it, its altitude is raised first; if the road can then be detected, a mode is selected for input according to the relative relation between the detected road center line l and the decision box R; if the road still cannot be detected, the current longitude and latitude C_x, C_y of the unmanned aerial vehicle, the longitude and latitude D'_x, D'_y of the last target point, and the direction θ' of the unmanned aerial vehicle unit displacement D_p are used as input, and the yaw control quantity is output to complete the unmanned aerial vehicle control;
S44: after this operation is finished, road detection is performed below the current position of the unmanned aerial vehicle, and the judgment is made again according to the detection result, until the unmanned aerial vehicle reaches the target place.
Further, in step S41, the calculation formula of the video decision box R is:
[formula shown as an image in the original document]
where R_A is the width of the captured image, h is the height of the unmanned aerial vehicle above the ground, and k is a scaling factor.
Further, in step S42, if the road center line l is in the decision box R, the longitude and latitude C_x, C_y of the actual position of the unmanned aerial vehicle and the longitude and latitude P_x, P_y of the position of the road are weighted and combined to the origin by the following formula:
[formula shown as an image in the original document]
A coordinate system is established with north as the Y axis and east as the X axis; the position of the next stage point of the along-road flight is taken as the target point, and the direction angle α and the distance Dis between the unmanned aerial vehicle and the target point are obtained in combination with the longitude and latitude D_x, D_y of the target point:
[formula shown as an image in the original document]
further, in the step S42, if the road center line l obtained by the road detection is outside the decision box R, the calculation formula of the direction angle and the flight distance of the unmanned aerial vehicle required to fly is:
Figure BDA0002395671390000034
wherein ,
Figure BDA0002395671390000035
the invention has the beneficial effects that: the invention can realize the autonomous control of unmanned plane along-road flight; meanwhile, under the condition of meeting the 4G network environment, when the traditional unmanned aerial vehicle is applied on a large scale, the manpower cost is reduced, and the unmanned aerial vehicle is ensured to fly to the purpose safely, stably and independently under the condition that the unmanned aerial vehicle can use less manpower operation cost. The invention can also ensure that the accounting unmanned aerial vehicle flies along the road accurately, so that the collision with the surrounding environment can be avoided to a greater extent, and the stable flying of the unmanned aerial vehicle is ensured.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objects and other advantages of the invention may be realized and obtained by means of the instrumentalities and combinations particularly pointed out in the specification.
Drawings
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail below in its preferred embodiments with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of a method for autonomous flight of a multi-rotor unmanned aerial vehicle along a path;
FIG. 2 is a schematic view of a road centerline within a decision box;
FIG. 3 is a schematic view of the center line of the roadway outside the decision box;
FIG. 4 is a schematic view of the position of the drone in the mode of FIG. 3;
FIG. 5 is a block diagram of an eight-way PID policy controller;
fig. 6 is a schematic flow chart of the multi-rotor unmanned aerial vehicle combined with road detection along-path autonomous flight method.
Detailed Description
Other advantages and effects of the present invention will become apparent to those skilled in the art from the following disclosure, which describes embodiments of the present invention with reference to specific examples. The invention may also be practiced or carried out in other embodiments, and the details of the present description may be modified or varied in various respects without departing from the spirit and scope of the present invention. It should be noted that the illustrations provided in the following embodiments merely illustrate the basic idea of the invention, and the following embodiments and the features in the embodiments may be combined with each other in the absence of conflict.
Referring to fig. 1 to 6, the method for detecting the autonomous flight along the road by combining the multi-rotor unmanned aerial vehicle with the road is provided, so as to realize the autonomous control of the unmanned aerial vehicle along the road, and specifically realize the following steps:
step 1: and preprocessing map roads in a possible flight range of the unmanned aerial vehicle. And dividing each road into a plurality of sections of GPS broken lines by utilizing an interface of the Goldmap, storing the initial GPS, the end GPS and 5 data of an included angle between each section of broken line in each road and the north direction, comparing the data with the GPS data of the unmanned aerial vehicle, and controlling the unmanned aerial vehicle to fly to the upper part of the road and correct.
Step 2: the ground station issues the task to the unmanned aerial vehicle. The ground station client contains the map information, the position of the unmanned aerial vehicle on the map, the status information of the unmanned aerial vehicle and the video information returned by it, and can control the unmanned aerial vehicle manually when necessary. The unmanned aerial vehicle is connected to the ground station through a 4G link and exchanges information with it. After the unmanned aerial vehicle takes off, a designated target point is selected; the ground station uses the AMap interface to plan the route of the along-road flight and outputs the route as a multi-segment polyline.
Step 3: the ground station processes the road-surface image information transmitted back by the unmanned aerial vehicle. The specific processing steps are as follows:
1. Calibrate the video obtained from the camera and remove lens distortion. Geometric distortion arises during image acquisition, and geometric correction of the unmanned aerial vehicle image mainly corrects the nonlinear distortion of the digital camera lens. The aerial camera is a digital camera, and optical distortion exists at the edges of the images it captures; this distortion shifts the position of actual image points, offsets image-point coordinates and changes the apparent ground position of real objects, ultimately affecting the detection result, so subsequent processing can only be performed after the distortion is corrected. The specific correction is given by formula (1):
[formula (1), shown as an image in the original document]
2. Convert the image to grayscale. The distortion-corrected picture is converted to grayscale by the weighted-average method; the corresponding gray image is obtained by formula (2):
V_grey = 0.299R + 0.587G + 0.114B (2)
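As a concrete illustration of formula (2), the weighted average can be applied per pixel; this is a generic sketch, not code from the patent:

```python
def to_gray(r, g, b):
    """Weighted-average grayscale of one RGB pixel, per formula (2)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def gray_image(rgb_rows):
    """Apply the conversion to a 2-D list of (R, G, B) tuples."""
    return [[to_gray(*px) for px in row] for row in rgb_rows]
```

The weights sum to 1, so a pure white pixel (255, 255, 255) maps to 255 and black stays 0.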
3. Extract edge information using the Canny edge detection algorithm. Canny is a gradient-based differential method: it first smooths and denoises the image with a Gaussian filter, then finds the intensity gradient of the image, applies non-maximum suppression to eliminate false edge detections, then applies a double-threshold method to determine possible boundaries, and finally tracks the boundaries using hysteresis. The Canny method is based on three basic goals: (1) a low error rate; (2) edge points should be well localized; (3) a single response per edge point. It is a comparatively good method for detecting lane-line edges.
4. Find straight lines by means of the Hough transform to locate the lane lines. The Hough transform is a feature-extraction technique used to isolate features of a particular shape in an image, applied in image analysis, computer vision and digital image processing. Its goal is to find imperfect instances of objects of a given class of shapes by a voting procedure. This voting is carried out in a parameter space, in which candidates are obtained as local maxima in a so-called accumulator space built explicitly by the algorithm that computes the Hough transform. The most basic Hough transform detects straight lines in binary images. Its main advantages are that it tolerates gaps in feature-boundary descriptions and is relatively immune to image noise.
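The voting procedure described above can be sketched with a minimal accumulator over the (θ, ρ) parameter space, using ρ = x·cosθ + y·sinθ. This toy version (names and resolution are illustrative, not from the patent) is far less efficient than a real implementation such as OpenCV's `HoughLines`:

```python
import math

def hough_lines(points, n_theta=180):
    """Vote each edge point (x, y) into (theta-index, rho) accumulator
    cells, where rho = x*cos(theta) + y*sin(theta) rounded to 1 pixel."""
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(t, rho)] = acc.get((t, rho), 0) + 1
    return acc

def best_line(acc):
    """Return the (theta-index, rho) cell with the most votes."""
    return max(acc, key=acc.get)
```

A vertical edge at x = 3, for example, piles all of its votes into the cell with θ = 0 and ρ = 3.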
5. Find the center line l of the road through K-means clustering, and record the included angle θ. The K-means algorithm is a simple iterative clustering algorithm that uses distance as the similarity index to find K classes in a given dataset; the center of each class is obtained as the mean of all values in the class and is described by the cluster center, as shown in formula (3):
[formula (3), shown as an image in the original document]
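A minimal 1-D K-means of the kind described in step 5 can be sketched as follows, clustering the detected Hough lines by a scalar feature and taking a cluster center for the road center line. The function name, the seeding scheme and the choice of scalar feature are illustrative assumptions:

```python
def kmeans_1d(values, k, iters=20):
    """Plain k-means on scalar values (e.g. x-intercepts of detected
    lines); each center is updated to the mean of its class."""
    # Seed with evenly spaced sorted values
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            i = min(range(k), key=lambda j: abs(v - centers[j]))
            clusters[i].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)
```

With two groups of lane-line responses, the two returned centers bracket the road and their midpoint can serve as the center line estimate.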
step 4: according to the processing result of the video image, the eight-direction PID strategy controller based on Kalman filtering is combined, and the specific implementation mode is as follows:
1. The video decision box R determines the mode and the input quantity of the eight-direction PID strategy controller. The size of the video decision box is given by formula (4):
[formula (4), shown as an image in the original document]
where R_A is the width of the captured image, h is the height of the unmanned aerial vehicle above the ground, and k is a scaling factor.
2. The ground station judges the mode of the unmanned aerial vehicle according to whether the road center line l is inside the decision box R.
1) If the image display result is as shown in fig. 2 and the road center line l is inside the decision box R, it can be determined that the unmanned aerial vehicle is above the road; the longitude and latitude C_x, C_y of the actual position of the unmanned aerial vehicle and the longitude and latitude P_x, P_y of the position of the road are weighted and combined to the origin by formula (5);
[formula (5), shown as an image in the original document]
A coordinate system is established with north as the Y axis and east as the X axis; the position of the next stage point of the along-road flight is taken as the target point, and formula (6) is applied with the longitude and latitude D_x, D_y of the target point:
[formula (6), shown as an image in the original document]
The direction angle α and the distance Dis between the unmanned aerial vehicle and the target point are thus obtained and input into the eight-direction PID strategy controller based on Kalman filtering; the yaw control quantity is taken as the output, and a control instruction is sent to the unmanned aerial vehicle over the 4G link to control it to fly to the next target point;
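The direction angle α and distance Dis of this step can be computed with a local flat-earth approximation. This is a sketch under assumed conventions, not the patent's formulas (5) and (6): α measured clockwise from north, coordinates given as (longitude, latitude) in degrees, and a mean Earth radius of 6371 km.

```python
import math

EARTH_R = 6371000.0  # assumed mean Earth radius, metres

def angle_and_distance(c_lon, c_lat, d_lon, d_lat):
    """Direction angle alpha (degrees clockwise from north) and distance
    Dis (metres) from the UAV position (C_x, C_y) to the target point
    (D_x, D_y), in a local frame with north as Y and east as X."""
    x = math.radians(d_lon - c_lon) * math.cos(math.radians((c_lat + d_lat) / 2)) * EARTH_R  # east
    y = math.radians(d_lat - c_lat) * EARTH_R                                                # north
    alpha = math.degrees(math.atan2(x, y)) % 360
    dis = math.hypot(x, y)
    return alpha, dis
```

The flat-earth form is adequate over the short hops between stage points; longer legs would call for the haversine formula.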
2) If the image display result is as shown in fig. 3 and the road center line l obtained by road detection is outside the decision box R, the unmanned aerial vehicle is offset from the road; using the extracted road center line l, the current longitude and latitude C_x, C_y of the unmanned aerial vehicle and the longitude and latitude D_x, D_y of the target point of the next stage of the along-road flight, the calculation is carried out as shown in fig. 4 with formula (7):
[formula (7), shown as an image in the original document]
The direction angle and the flight distance the unmanned aerial vehicle is required to fly are obtained, as shown in formula (8):
[formula (8), shown as an image in the original document]
The direction angle and the flight distance that the unmanned aerial vehicle is required to fly are input into the eight-direction PID strategy controller based on Kalman filtering, the yaw control quantity is taken as the output, and a control instruction is sent to the unmanned aerial vehicle over the 4G link so that it completes the flight task more effectively and accurately;
3) If the unmanned aerial vehicle cannot detect the road information below it, its altitude is raised first. If the road can then be detected, a mode is selected for input according to the relative relation between the detected road center line l and the decision box R; if the road still cannot be detected, the current longitude and latitude C_x, C_y of the unmanned aerial vehicle, the longitude and latitude D'_x, D'_y of the last target point and the direction θ' of the unmanned aerial vehicle unit displacement D_p are used as input, and the yaw control quantity is output to complete the unmanned aerial vehicle control.
4) After this operation is finished, road detection is performed below the current position of the unmanned aerial vehicle, and the judgment is made again according to the detection result.
5) The specific design of the eight-direction PID strategy controller is shown in fig. 5. On the first input, the longitude and latitude C_x, C_y of the unmanned aerial vehicle are used; on subsequent inputs, the included angle α to the target point and the planned flight distance Dis are used as the input quantities of the eight-direction flight controller. The yaw control quantity is obtained from the relative position and distance, and the flight direction O and the flight time t are used as the input to the unmanned aerial vehicle; the control disturbance ω is then added and the whole is input to the unmanned aerial vehicle to control its flight. After a flight stage is completed, the longitude and latitude C'_x, C'_y returned by the unmanned aerial vehicle sensors plus the measurement disturbance v, together with the longitude and latitude C_sx, C_sy obtained from the theoretical flight result without the control disturbance, are used as the input of the Kalman filter, which outputs C_ax, C_ay as the longitude and latitude of the unmanned aerial vehicle; the mode selected from the road-detection result is then used as the input of the next controller step, until the destination is reached.
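The quantization to eight fixed control directions O and the conversion of Dis into a flight time t at constant speed V, described above, can be sketched as follows. The direction labels and the default speed are assumptions for illustration:

```python
DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def eight_direction_command(alpha, dis, v=5.0):
    """Quantize the target bearing alpha (degrees clockwise from north)
    to the nearest of eight flight directions O, and convert the
    distance dis to a flight time t at constant speed v (m/s)."""
    idx = round(alpha / 45.0) % 8  # each direction covers a 45-degree sector
    return DIRECTIONS[idx], dis / v
```

The controller then issues (O, t) to the flight stack instead of a continuous heading, which is what makes the eight-direction PID strategy simple to command over a 4G link.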
The Kalman filter formulas are as follows:
State variable of the system:
X_k = [V_x, V_y, C_x, C_y]^T (9)
State equation of the system:
X_k = A X_{k-1} + B U_{k-1} + ω_{k-1} (10)
Observation equation of the system:
Z_k = H X_k + v_k (11)
Prediction stage:
X_k = A X'_{k-1} + B U_{k-1} (12)
P_k^- = A P_{k-1} A^T + Q (13)
Correction stage:
K_k = P_k^- H^T (H P_k^- H^T + R)^{-1} (14)
X'_k = X_k + K_k (Z_k - H X_k) (15)
P_k = (I - K_k H) P_k^- (16)
where A is the state transition matrix, B is the matrix converting the input into the state, P_k represents the a posteriori estimated covariance at time k, P_k^- represents the a priori estimated covariance at time k, H is the state-variable-to-measurement (observation) transition matrix connecting the states and the observations, and K_k is the Kalman filter gain. ω_k represents the process noise, assumed to be normally distributed zero-mean Gaussian white noise with covariance matrix Q, i.e. ω_k ~ N(0, Q); v_k represents the measurement noise, assumed to be normally distributed zero-mean Gaussian white noise with covariance matrix R, i.e. v_k ~ N(0, R).
The components C_ax, C_ay of the finally obtained X'_k are taken as the latest actual position of the unmanned aerial vehicle, and this result is used as the state variable X_k of the next input.
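Equations (12) to (16) map directly onto one predict/correct cycle; the following is a generic NumPy sketch of that cycle, with the matrix values in the usage test chosen for illustration rather than taken from the patent:

```python
import numpy as np

def kalman_step(x_post, P_post, u, z, A, B, H, Q, R):
    """One Kalman cycle: prediction (12)-(13), then correction (14)-(16)."""
    # Prediction stage, equations (12)-(13)
    x_pred = A @ x_post + B @ u                # X_k = A X'_{k-1} + B U_{k-1}
    P_pred = A @ P_post @ A.T + Q              # P_k^- = A P_{k-1} A^T + Q
    # Correction stage, equations (14)-(16)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)  # Kalman gain K_k
    x_new = x_pred + K @ (z - H @ x_pred)      # X'_k, fused state
    P_new = (np.eye(P_pred.shape[0]) - K @ H) @ P_pred      # P_k
    return x_new, P_new
```

In the patent's use, z carries the GPS position fed back by the UAV and x_pred comes from the theoretical flight result, so x_new plays the role of the fused position C_ax, C_ay.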
In an embodiment, as shown in fig. 6, the method for the multi-rotor unmanned aerial vehicle to fly autonomously along the road in combination with road detection comprises the following specific steps:
Preprocessing: based on GPS data, record the GPS values of drivable roads and store them in the ground station database for comparison and retrieval;
s1: a control person performs task issuing on the unmanned aerial vehicle at a ground station, and a starting point and an ending point of a route are determined by planning the route of the unmanned aerial vehicle flying along the route;
s2: the image sensor on the unmanned aerial vehicle is fixed vertically downwards, and the image sensor of the unmanned aerial vehicle is connected and controlled through an onboard embedded mode so as to acquire video information right below the unmanned aerial vehicle;
s3: transmitting the video information acquired by the image sensor to a ground station through a 4G link;
s4: the extracted relative position and angle of the unmanned aerial vehicle and the road are utilized, and a GPS value returned by the unmanned aerial vehicle and a road GPS data set of the area are combined, a control instruction is sent to the unmanned aerial vehicle through a ground station, so that the unmanned aerial vehicle is controlled to fly above the road first;
s5: starting autonomous flight along the road; judging whether the road is in the square frame, if so, inputting the direction and the distance of the stage target point into a PID controller based on Kalman filtering; if not, inputting the direction and the distance of the correction position into a PID controller based on Kalman filtering; and after the unmanned aerial vehicle is finished, continuing to judge by using the position and the road detection result output by the controller until the unmanned aerial vehicle flies to the target point.
Finally, it is noted that the above embodiments are only for illustrating the technical solution of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made thereto without departing from the spirit and scope of the present invention, which is intended to be covered by the claims of the present invention.

Claims (5)

1. A method for a multi-rotor unmanned aerial vehicle to fly autonomously along a road in combination with road detection, characterized in that the method comprises the following steps:
s1: the image sensor on the unmanned aerial vehicle is fixed vertically downwards, and the image sensor of the unmanned aerial vehicle is connected and controlled through an onboard embedded mode so as to acquire video information right below the unmanned aerial vehicle;
s2: transmitting the video information acquired by the image sensor to a ground station through a 4G link;
s3: the method comprises the steps of detecting a road through a ground station, and extracting the position and the angle of a road center line l in the whole video image;
s4: selecting an unmanned aerial vehicle mode by utilizing the relation between the extracted unmanned aerial vehicle and a road center line l and a decision box R, combining GPS information returned by the unmanned aerial vehicle and a road GPS data set of a region where the GPS information is located, inputting the GPS data set into an eight-direction PID strategy controller based on Kalman filtering to obtain a yaw control quantity, sending a control instruction to the unmanned aerial vehicle through a ground station, controlling the unmanned aerial vehicle to fly above a road at first, and controlling the unmanned aerial vehicle to fly along the road autonomously;
according to the processing result of the video image, the specific steps of obtaining the yaw control quantity by combining the eight-direction PID strategy controller based on Kalman filtering are as follows:
s41: the video decision box R determines the mode and the input quantity of the eight-direction PID strategy controller;
s42: the ground station judges the mode of the unmanned aerial vehicle according to whether the road center line l is inside the decision box R;
(1) If the road center line l is in the decision box R, the unmanned aerial vehicle is determined to be above the road; the direction angle and the distance between the unmanned aerial vehicle and the target point are calculated according to the longitude and latitude C_x, C_y of the current actual position of the unmanned aerial vehicle, the longitude and latitude P_x, P_y of the position of the road, and the longitude and latitude D_x, D_y of the target point of the next stage of the along-road flight;
(2) If the road center line l is outside the decision box R, the unmanned aerial vehicle has drifted off the road; the direction angle and the flight distance that the unmanned aerial vehicle needs to fly are calculated by using the extracted road center line l, the longitude and latitude C_x, C_y of the current actual position of the unmanned aerial vehicle, and the longitude and latitude D_x, D_y of the target point of the next stage of the along-road flight;
the direction angle and the distance between the unmanned aerial vehicle and the target point calculated in mode (1) or (2) are quantized into a constant speed V and one of the eight fixed directions O; then the GPS value CP_x, CP_y of the unmanned aerial vehicle position and the control time t are taken as inputs of the controller to control the unmanned aerial vehicle; the obtained actual fed-back position C'_x, C'_y of the unmanned aerial vehicle and the theoretical position C_sx, C_sy are simultaneously input into a Kalman filter to obtain the fused position C_ax, C_ay of the unmanned aerial vehicle, which serves as the condition for the next judgement, and the process advances to step S44;
s43: if the unmanned aerial vehicle cannot detect the road information below it, the height of the unmanned aerial vehicle is first raised; if the road can then be detected, the mode and input are selected according to the detected relative relation between the road center line l and the decision box R; if the road still cannot be detected, the current longitude and latitude C_x, C_y of the unmanned aerial vehicle, the longitude and latitude D'_x, D'_y of the last target point, and the direction θ' of the unit displacement D_p of the unmanned aerial vehicle are taken as input, and the yaw control quantity is output to complete the control of the unmanned aerial vehicle;
s44: after the manoeuvre is finished, road detection is performed below the current position of the unmanned aerial vehicle, and the judgement is made again according to the detection result until the unmanned aerial vehicle reaches the target location.
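The quantization into a constant speed V and eight fixed directions O, and the fusion of the fed-back position C'_x, C'_y with the theoretical position C_sx, C_sy, can be illustrated as follows. This is a minimal Python sketch: the fixed-gain blend stands in for the claim's Kalman filter (a real filter would derive the gain from the covariances), and all function and variable names are assumptions, not from the patent.

```python
import math

# Eight fixed control directions, clockwise from north.
DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def quantize_heading(angle_deg):
    """Map a heading (degrees clockwise from north) onto the nearest of the
    eight fixed directions O."""
    idx = round((angle_deg % 360) / 45) % 8
    return DIRECTIONS[idx]

def fuse_positions(feedback, theoretical, gain=0.6):
    """Blend the fed-back position (C'_x, C'_y) with the theoretical position
    (C_sx, C_sy). A full Kalman filter would compute `gain` from the state
    and measurement covariances; here it is a fixed illustrative constant."""
    return tuple(gain * f + (1 - gain) * t for f, t in zip(feedback, theoretical))
```

With the heading quantized, the controller only needs to command one of eight yaw set-points at constant speed, which is what makes an eight-direction PID strategy tractable.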
2. The method for autonomous flight along the road by a multi-rotor unmanned aerial vehicle combined with road detection according to claim 1, wherein in the step S3, the ground station performs road detection on the obtained video image, which comprises processing the road image information transmitted back by the unmanned aerial vehicle; the specific process comprises the following steps:
s31: firstly, calibrating the video obtained from the camera and removing lens distortion;
s32: converting the image to grayscale;
s33: extracting edge information from the image by using the Canny edge detection algorithm;
s34: locating the pavement lines by finding straight lines with the Hough transform;
s35: finding the center line l of the road through K-means clustering, and recording the included angle θ.
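Steps s31 to s34 are typically implemented with OpenCV (undistortion, grayscale conversion, Canny, probabilistic Hough). The sketch below illustrates only step s35, using a tiny pure-Python 1-D k-means that clusters the x-offsets of the detected lane-edge lines and takes their midpoint as the center line; the choice of k=2 (left and right road edges) and all names are assumptions, not the patent's implementation.

```python
import random
import statistics

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Plain 1-D k-means: alternate nearest-center assignment and mean update."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # assign each value to its nearest current center
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        # recompute centers; keep the old center if a cluster went empty
        centers = [statistics.mean(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

def centerline_offset(edge_offsets_px):
    """Cluster the x-offsets of detected lane-edge lines into two edges and
    return their midpoint as the road center line position (in pixels)."""
    left, right = kmeans_1d(edge_offsets_px, k=2)
    return (left + right) / 2
```

For example, Hough line offsets grouped around two road edges at roughly 11 px and 90 px yield a center line near 50 px; the same idea applies to clustering line angles to obtain θ.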
3. The method for autonomous flight along the road by a multi-rotor unmanned aerial vehicle combined with road detection according to claim 1, wherein in the step S41, the calculation formula of the video decision box R is as follows:
[The formula for R is reproduced only as an image (FDA0004190842990000021) in the source publication.]
wherein R_A is the width of the captured image, h is the height of the unmanned aerial vehicle above the ground, and k is the scaling factor.
4. The method for autonomous flight along the road by a multi-rotor unmanned aerial vehicle combined with road detection according to claim 1, wherein in the step S42, if the road center line l is in the decision box R, the longitude and latitude C_x, C_y of the actual position of the unmanned aerial vehicle and the longitude and latitude P_x, P_y of the position of the road are combined with weights by the following formula:
[The weighting formula is reproduced only as an image (FDA0004190842990000022) in the source publication.]
establishing a coordinate system with north as the Y axis and east as the X axis, taking the position of the next stage target point of the along-road flight as the target point, and combining the longitude and latitude D_x, D_y of the target point to obtain the direction angle α and the distance Dis between the unmanned aerial vehicle and the target point:
[The formulas for α and Dis are reproduced only as an image (FDA0004190842990000023) in the source publication.]
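Since the formulas for the direction angle α and the distance Dis survive only as images in the source, the sketch below uses a standard small-area approximation rather than the patent's exact expressions: project the latitude/longitude difference onto a local plane with X east and Y north (as claim 4 specifies) and take the atan2 and the Euclidean norm. The function name, the equirectangular projection, and the Earth-radius constant are assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, metres

def bearing_and_distance(c_lon, c_lat, d_lon, d_lat):
    """Direction angle (degrees clockwise from north) and straight-line
    distance in metres from the drone position (C_x, C_y) to the stage
    target (D_x, D_y), using a local east-north projection."""
    lat0 = math.radians((c_lat + d_lat) / 2)
    dx = math.radians(d_lon - c_lon) * math.cos(lat0) * EARTH_RADIUS_M  # east
    dy = math.radians(d_lat - c_lat) * EARTH_RADIUS_M                   # north
    alpha = math.degrees(math.atan2(dx, dy)) % 360
    return alpha, math.hypot(dx, dy)
```

A target 0.001 degrees due north corresponds to α = 0 and roughly 111 metres, which is the scale at which this flat-plane approximation is accurate.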
5. The method for autonomous flight along the road by a multi-rotor unmanned aerial vehicle combined with road detection according to claim 1, wherein in the step S42, if the road center line l obtained by the road detection is outside the decision box R, the calculation formula of the direction angle and the flight distance of the unmanned aerial vehicle is:
[The formula for the direction angle and the flight distance is reproduced only as an image (FDA0004190842990000031) in the source publication,]

wherein the auxiliary terms are defined by a further formula reproduced only as an image (FDA0004190842990000032) in the source publication.
CN202010130574.6A 2020-02-28 2020-02-28 Method for detecting autonomous flight along road by combining multi-rotor unmanned aerial vehicle with road Active CN111309048B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010130574.6A CN111309048B (en) 2020-02-28 2020-02-28 Method for detecting autonomous flight along road by combining multi-rotor unmanned aerial vehicle with road


Publications (2)

Publication Number Publication Date
CN111309048A CN111309048A (en) 2020-06-19
CN111309048B true CN111309048B (en) 2023-05-26

Family

ID=71156647

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010130574.6A Active CN111309048B (en) 2020-02-28 2020-02-28 Method for detecting autonomous flight along road by combining multi-rotor unmanned aerial vehicle with road

Country Status (1)

Country Link
CN (1) CN111309048B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117389311B (en) * 2023-12-13 2024-04-09 北京御航智能科技有限公司 Unmanned aerial vehicle autonomous obstacle surmounting method and device, electronic equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226354A (en) * 2013-02-27 2013-07-31 广东工业大学 Photoelectricity-navigation-based unmanned road recognition system
CN103500322A (en) * 2013-09-10 2014-01-08 北京航空航天大学 Automatic lane line identification method based on low-altitude aerial images
CN105318888A (en) * 2015-12-07 2016-02-10 北京航空航天大学 Unmanned perception based unmanned aerial vehicle route planning method
CN105389988A (en) * 2015-12-07 2016-03-09 北京航空航天大学 Multi-unmanned aerial vehicle cooperation highway intelligent inspection system
CN105450950A (en) * 2015-12-07 2016-03-30 北京航空航天大学 Method for removing jitter from aerial video of unmanned aerial vehicle
CN105549603A (en) * 2015-12-07 2016-05-04 北京航空航天大学 Intelligent road tour inspection control method for multi-rotor-wing unmanned aerial vehicle
CN107633568A (en) * 2017-09-11 2018-01-26 太仓史瑞克工业设计有限公司 A kind of Intelligent road patrol method and its system based on teaching unmanned plane
CN108198417A (en) * 2017-12-29 2018-06-22 叶片青 A kind of road cruising inspection system based on unmanned plane
CN108549208A (en) * 2018-03-14 2018-09-18 重庆邮电大学 A kind of quadrotor attitude control method based on factor adaptive fuzzy
CN110764526A (en) * 2018-07-25 2020-02-07 杭州海康机器人技术有限公司 Unmanned aerial vehicle flight control method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5137617B2 (en) * 2008-02-27 2013-02-06 富士重工業株式会社 Steering support device
US9025825B2 (en) * 2013-05-10 2015-05-05 Palo Alto Research Center Incorporated System and method for visual motion based object segmentation and tracking
US10618673B2 (en) * 2016-04-15 2020-04-14 Massachusetts Institute Of Technology Systems and methods for dynamic planning and operation of autonomous systems using image observation and information theory


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Luis Rodolfo García Carrillo et al., "Quad Rotorcraft Switching Control: An Application for the Task of Path Following", IEEE Transactions on Control Systems Technology, 2014, pp. 1255-1267 *
Huang Ruimin et al., "Research on Control Strategies for a Quadrotor Tracking a Moving Ground Target", Application of Electronic Technique, 2018, pp. 67-71, 76 *
Zhu Kefeng, "OpenCV-based Automatic Line-Following Unmanned Aerial Vehicle", China Master's Theses Full-text Database, Engineering Science and Technology II, June 2019, p. C031-75 *
Hua Jin et al., "Image-Moment-Based Visual Servoing Hover Control of an Aerial Vehicle", 2018 Chinese Automation Congress (CAC 2018), 2018, pp. 710-715 *

Also Published As

Publication number Publication date
CN111309048A (en) 2020-06-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant