CN107783119A - Decision fusion method applied in an obstacle avoidance system - Google Patents

Decision fusion method applied in an obstacle avoidance system

Info

Publication number
CN107783119A
Authority
CN
China
Prior art keywords
data
height
obstacle
unmanned aerial vehicle
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610724576.1A
Other languages
Chinese (zh)
Inventor
田雨农
王鑫照
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian Roiland Technology Co Ltd
Original Assignee
Dalian Roiland Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian Roiland Technology Co Ltd
Priority to CN201610724576.1A
Publication of CN107783119A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/933 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A decision fusion method applied in an obstacle avoidance system, in which the decision level performs the following processing. P1: first judge the relative distance between the unmanned aerial vehicle (UAV) and the obstacle, and divide the relative distance into three ranges: less than N1 m, N1 m to N2 m, and N2 m to N3 m. P2: after the distance division is complete, divide the situation into danger levels according to the relative velocity of the UAV and the obstacle. P3: then judge the height of the UAV above the ground, and divide the height value H into four grades. P4: for the danger level, step P3 must be carried out; for the warning level, step P3 must be carried out after emergency deceleration; for the prompt level and the irrelevant level, the third-step judgement is not needed and the process returns and detects again. The application obtains more accurate obstacle data and can therefore make more accurate avoidance decisions and exercise avoidance control.

Description

Decision fusion method applied in an obstacle avoidance system
Technical field
The invention belongs to the technical field of unmanned aerial vehicle (UAV) obstacle avoidance, and in particular relates to a decision fusion method applied in an obstacle avoidance system.
Background art
In recent years UAV technology has quickly become a new focus of research and development at home and abroad. Because unmanned equipment offers high maneuverability, flexible operation, low cost, real-time image transmission and high resolution, UAVs are applied in many fields, such as disaster relief, power line inspection, forest fire prevention, agricultural spraying, plant protection and aerial photography.
At post-disaster rescue sites, traditional means have many limitations, so UAV technology has gradually developed there. Post-disaster rescue UAVs are mainly used after a disaster, when the environment is harsh, the field situation cannot be learned in time and the rescue is urgent; they can reach the rescue site by the fastest and most convenient means and observe it from the air. The UAV shoots and records the post-disaster scene with a high-definition camera and transmits the collected real-time aerial data back. Applying UAVs to post-disaster rescue not only avoids endangering an aircrew, but also lets rescuers view the disaster scene at the earliest possible moment and arrange post-disaster treatment and other work.
Because the post-disaster site environment is complex and unknown, a UAV shooting and recording the scene may collide with obstacles, which damages the UAV and delays the rapid understanding of the relief situation. It is therefore necessary to ensure the flight safety of the UAV during post-disaster rescue.
Content of the invention
The present invention proposes a decision fusion method applied in an obstacle avoidance system, in order to obtain more accurate obstacle data and thereby make more accurate avoidance decisions and exercise avoidance control.
The invention provides a decision fusion method applied in an obstacle avoidance system, in which a decision level performs the following processing:
P1: first judge the relative distance between the UAV and the obstacle, and divide the relative distance into three ranges: less than N1 m, N1 m to N2 m, and N2 m to N3 m;
P2: after the distance division is complete, divide the situation into danger levels according to the relative velocity of the UAV and the obstacle;
P3: then judge the height of the UAV above the ground, and divide the height value H into four grades;
P4: for the danger level, step P3 must be carried out; for the warning level, step P3 must be carried out after emergency deceleration; for the prompt level and the irrelevant level, the third-step judgement is not needed and the process returns and detects again.
Further, the division of danger levels in step P2 is specifically:
when the distance is less than N1 m, if the speed is greater than M1 m/s and the warning time is less than Q s, the situation belongs to the danger level, and if the speed is less than M1 m/s, it belongs to the warning level;
when the distance satisfies N1 m ≤ R < N2 m, if the speed is greater than M2 m/s, the situation is at the danger level; if M1 m/s ≤ V < M2 m/s, it is at the warning level; if the speed is less than M1 m/s, it is at the prompt level;
when the distance satisfies N2 m ≤ R < N3 m, if the speed is greater than M3 m/s, the situation is at the danger level; if M2 m/s ≤ V < M3 m/s, it is at the warning level; if M1 m/s ≤ V < M2 m/s, it is at the prompt level; if the speed is less than M1 m/s, it is at the irrelevant level.
Further, the division into four grades in step P3 is specifically:
when the height is less than X1 m, the obstacles include walls, trees and people; the attributes of the obstacle are recognized by the binocular vision sensor to distinguish walls, trees and people, and the UAV then hovers urgently and climbs; through the climb a person can be avoided completely, but for trees and walls a further height judgement is needed;
when X1 m ≤ H < X2 m, the obstacles at this height include walls and trees; the attributes of the obstacle are recognized by the binocular vision sensor to distinguish walls and trees, and the UAV then hovers urgently and climbs; through the climb trees can be avoided completely, but for walls a further height judgement is needed;
when X2 m ≤ H < X3 m, the obstacles at this height include walls and high-voltage lines; the attributes of the obstacle are recognized by the vision acquisition sensor to distinguish walls and high-voltage lines, and the UAV then hovers urgently and climbs; through the climb a high-voltage line can be avoided completely, but for a wall the height judgement must continue;
when the height H ≥ X3 m, the attributes of the obstacle at this height are recognized by the binocular vision sensor; if it is confirmed that the obstacle is still a wall surface, the UAV hovers urgently and then turns back to avoid it.
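To make the decision flow concrete, the following minimal Python sketch walks through steps P1-P4, leaving the patent's placeholder thresholds (N1-N3, M1-M3, Q, X1-X3) as parameters. The class, function and action names are illustrative assumptions, not terminology from the patent.

```python
from enum import Enum


class Level(Enum):
    DANGER = "danger"
    WARNING = "warning"
    PROMPT = "prompt"
    IRRELEVANT = "irrelevant"


def danger_level(distance, speed, warning_time, N1, N2, N3, M1, M2, M3, Q):
    """Steps P1 and P2: classify by distance band, then by relative speed."""
    if distance < N1:
        if speed > M1 and warning_time < Q:
            return Level.DANGER
        return Level.WARNING          # speed below M1 (or ample warning time)
    if distance < N2:
        if speed > M2:
            return Level.DANGER
        return Level.WARNING if speed >= M1 else Level.PROMPT
    if distance < N3:
        if speed > M3:
            return Level.DANGER
        if speed >= M2:
            return Level.WARNING
        return Level.PROMPT if speed >= M1 else Level.IRRELEVANT
    return Level.IRRELEVANT           # beyond N3: nothing to react to


def height_action(height, obstacle_class, X1, X2, X3):
    """Step P3: four height grades; the binocular vision sensor supplies
    obstacle_class ('person', 'tree', 'wall', 'high_voltage_line')."""
    if height < X1:
        cleared = obstacle_class == "person"
    elif height < X2:
        cleared = obstacle_class == "tree"
    elif height < X3:
        cleared = obstacle_class == "high_voltage_line"
    else:
        # at or above X3 only a wall remains relevant: hover, then turn back
        return "hover_and_turn_back" if obstacle_class == "wall" else "continue"
    # hover urgently and climb; recheck the height unless the climb clears it
    return "hover_and_climb" if cleared else "hover_climb_then_recheck"


def decide(distance, speed, warning_time, height, obstacle_class, thresholds):
    """Step P4: only danger/warning levels go on to the height judgement."""
    level = danger_level(distance, speed, warning_time, *thresholds["distance_speed"])
    if level is Level.DANGER:
        return height_action(height, obstacle_class, *thresholds["height"])
    if level is Level.WARNING:
        return "emergency_decelerate; " + height_action(height, obstacle_class, *thresholds["height"])
    return "re_detect"                # prompt / irrelevant: detect again
```

With the concrete values of embodiment 4 below, thresholds["distance_speed"] would be (10, 20, 50, 3, 6, 16, 3); the height thresholds X1-X3 remain symbolic throughout the patent.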
Further, the above method also includes: the data acquisition layer processes the data collected by each sensor:
1) the millimeter-wave radar sensor outputs the relative distance R1 and relative velocity V1 between the UAV and the obstacle, and the angles between the obstacle and the radar normal, namely the azimuth angle θ1 and the pitch angle ψ1;
2) the ultrasonic radar sensor outputs the relative distance R2 between the UAV and the obstacle;
3) the binocular vision sensor outputs the obstacle area S, the azimuth angle θ2 and the relative distance R3;
4) the radar altitude sensor outputs the height value R4 of the UAV above the ground;
5) the GPS/BeiDou positioning sensor outputs the time T, the positioning state S (A for a valid fix, V for no fix), north latitude N or south latitude S, east longitude E or west longitude W, and the UAV speed V2;
6) the AHRS module outputs the three-axis accelerations A_x, A_y, A_z, the three-axis angular velocities w_x, w_y, w_z and the three-axis geomagnetic field intensities m_x, m_y, m_z, from which the current attitude of the UAV is calculated, namely the flight azimuth θ3, the pitch angle ψ2 and the roll angle φ.
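The patent gives this attitude calculation as a formula that is not reproduced in the text. As a hedged illustration, the sketch below shows one common way to derive roll, pitch and a tilt-compensated azimuth from the accelerometer and magnetometer readings while the UAV is not accelerating strongly; it is an assumption, not necessarily the patent's exact equations, and sign conventions differ between AHRS products.

```python
import math


def attitude_from_ahrs(A_x, A_y, A_z, m_x, m_y, m_z):
    """Roll and pitch from the measured gravity direction, azimuth from the
    tilt-compensated magnetometer (a common textbook formulation)."""
    roll = math.atan2(A_y, A_z)                           # roll angle (phi)
    pitch = math.atan2(-A_x, math.hypot(A_y, A_z))        # pitch angle (psi2)
    # rotate the magnetic field vector back into the horizontal plane
    mx_h = m_x * math.cos(pitch) + m_z * math.sin(pitch)
    my_h = (m_x * math.sin(roll) * math.sin(pitch)
            + m_y * math.cos(roll)
            - m_z * math.sin(roll) * math.cos(pitch))
    azimuth = math.atan2(-my_h, mx_h)                     # flight azimuth (theta3)
    return azimuth, pitch, roll
```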
Further, the above method also includes: the feature layer performs data fusion of the relative distance between the UAV and the obstacle, data fusion of the relative height between the UAV and the ground, data fusion of the relative velocity between the UAV and the obstacle, and acquisition of attribute features of the obstacle such as its size and shape.
Further, the data fusion of the relative distance between the UAV and the obstacle is processed according to the distance range:
A. when the distance is in the range of 0 m to 10 m, the ultrasonic radar sensor, the binocular vision sensor and the millimeter-wave radar sensor all detect the obstacle; weights α and β are introduced to take a weighted average of the ultrasonic radar, binocular vision and millimeter-wave radar measurements, and the weighted result is then passed through Kalman data fusion;
B. when the distance is in the range of 10 m to 20 m, a weight α is introduced to take a weighted average of the two sensors, the binocular vision sensor and the millimeter-wave radar sensor, and the weighted result is then passed through Kalman data fusion;
C. when the distance is in the range of 20 m to 50 m, Kalman data fusion is performed directly on the data of the millimeter-wave radar sensor.
Further, according to the actual range, each sensor contributes to the fused distance as set out in cases A to C above.
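A minimal sketch of this range-dependent distance fusion, assuming a scalar constant-position Kalman filter as the "Kalman data fusion" step. The weight values α and β, which weight attaches to which sensor, and the class and function names are illustrative assumptions; the patent does not specify them.

```python
class ScalarKalman:
    """One-dimensional Kalman filter used to smooth the fused distance."""

    def __init__(self, q=0.05, r=0.5):
        self.q, self.r = q, r          # process and measurement noise variances
        self.x, self.p = None, 1.0     # state estimate and its variance

    def update(self, z):
        if self.x is None:             # initialise on the first measurement
            self.x = z
            return self.x
        self.p += self.q               # predict (static distance model)
        k = self.p / (self.p + self.r) # Kalman gain
        self.x += k * (z - self.x)     # correct with the new measurement
        self.p *= 1.0 - k
        return self.x


def fuse_distance(kf, R1, R2=None, R3=None, alpha=0.5, beta=0.3):
    """R1: millimeter-wave radar, R2: ultrasonic radar, R3: binocular vision.
    0-10 m: all three sensors; 10-20 m: radar and vision; 20-50 m: radar only."""
    if R2 is not None and R3 is not None:      # 0-10 m band
        z = alpha * R2 + beta * R3 + (1.0 - alpha - beta) * R1
    elif R3 is not None:                        # 10-20 m band
        z = alpha * R3 + (1.0 - alpha) * R1
    else:                                       # 20-50 m band
        z = R1
    return kf.update(z)
```

The same weighted-average-then-Kalman pattern recurs below for the height fusion and the relative-velocity fusion.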
Further, the data fusion of the relative height between the UAV and the ground fuses the UAV height values acquired by the radar altitude sensor and the GPS/BeiDou positioning sensor, and is divided into two cases according to the height range:
for heights below 100 m, both the radar altitude sensor and the GPS/BeiDou positioning sensor are used to detect the UAV height; the results are combined by a weighted average, i.e. a weight α is introduced to average the height values of the two sensors, Kalman data fusion is then applied to the height value, and the height value is corrected according to the AHRS attitude data;
for heights above 100 m, the GPS/BeiDou positioning sensor is used alone; the acquired height data are passed directly through Kalman data fusion and then corrected using the AHRS attitude data;
where H1 is the height acquired by the radar altitude sensor and H2 is the height acquired by the GPS/BeiDou positioning sensor; at the same time, obstacle identification is carried out from features such as the size and shape of the obstacle obtained by the binocular vision sensor.
Further, the height correction using the AHRS attitude data is specifically:
where ψ2 is the pitch angle, φ is the roll angle, H is the measured height and H′ is the corrected height value.
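The correction formula itself appears in the original publication as an image and is not reproduced above. A plausible reconstruction, assuming the radar altimeter measures range along the airframe's vertical axis so that the true ground height is recovered by projecting through the pitch and roll angles, is H′ = H · cos ψ2 · cos φ, sketched below.

```python
import math


def correct_height(H, pitch_psi2, roll_phi):
    """Assumed form of the AHRS-based height correction (angles in radians);
    the patent's exact expression is not reproduced here."""
    return H * math.cos(pitch_psi2) * math.cos(roll_phi)
```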
Further, the data fusion of the relative velocity between the UAV and the obstacle uses the millimeter-wave radar sensor and the GPS/BeiDou positioning sensor; the speed data of the two sensors are combined by a weighted average, i.e.
V = α × V1 + (1 − α) × V2, where α is the weight ratio of the two sensors; the weighted speed is then passed through Kalman data fusion, and obstacle identification is carried out from features such as the size and shape of the obstacle obtained by the binocular vision sensor.
Because the above technical solution is adopted, the present invention achieves the following technical effects. The application enables a post-disaster rescue rotor UAV to perceive the complex environment of a post-disaster rescue site better, obtain more accurate obstacle data, make more accurate avoidance decisions and exercise avoidance control. Data fusion integrates the incomplete local-environment data provided by several sensors of the same or different types at different positions, eliminates redundant or contradictory data between the sensors, makes them complement each other and reduces their uncertainty, forming a relatively complete and consistent perception of the system environment; this improves the speed and correctness of the intelligent system's decision-making, planning and reaction and lowers the decision risk.
For measuring the distance between the UAV and an obstacle, the ultrasonic, binocular vision and millimeter-wave radar sensors each have their own advantages and shortcomings; combining the three reasonably and effectively makes their distance measurements complement each other and improves the accuracy of the acquired distance data. Similarly, for the relative velocity between the UAV and the obstacle and for the height of the UAV above the ground, the advantages of each sensor are exploited and their weaknesses compensated through data fusion, improving data accuracy and system reliability.
Brief description of the drawings
To explain the embodiments of the invention or the technical solutions of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the invention, and a person of ordinary skill in the art can obtain other drawings from them without creative work.
Fig. 1 is a block diagram of the detection device in embodiment 1;
Fig. 2 is a schematic diagram of the system structure for the data fusion method between the UAV and an obstacle;
Fig. 3 is a schematic diagram of the data fusion structure for the relative distance between the UAV and an obstacle;
Fig. 4 is a schematic diagram of the data fusion structure for the relative height between the UAV and the ground;
Fig. 5 is a schematic diagram of the data fusion structure for the relative velocity between the UAV and an obstacle;
Fig. 6 is a schematic diagram of the structure of the decision level;
Fig. 7 is a flow chart of the decision level in the embodiment.
Detailed description of the embodiments
To make the purpose, technical solution and advantages of the embodiments of the invention clearer, the technical solutions in the embodiments of the invention are described clearly and completely below with reference to the accompanying drawings:
The UAV targeted by this application is mainly a multi-rotor post-disaster rescue UAV. A multi-rotor UAV generates lift by its rotors cutting through the air; compared with a fixed-wing aircraft it can start from a standstill, hover, fly slowly, carry a heavy load, manoeuvre flexibly and fly at very low altitude. A multi-rotor UAV needs no runway, takes off and lands vertically and can hover after take-off, which suits operation in a complex post-disaster environment. Its control principle is simple: the four sticks of the remote controller command the aircraft's forward/backward, left/right, up/down and yaw motions.
Embodiment 1
This embodiment provides a decision fusion method applied in an obstacle avoidance system, comprising a data acquisition layer, a feature layer, a decision level and a detection device.
The detection device includes:
a radar altitude sensor, which measures the vertical distance from the UAV to the ground;
a GPS/BeiDou positioning sensor, which positions the UAV in real time to support tasks such as fixed-point hovering, and which can also measure the UAV height and the UAV relative velocity;
an AHRS module, which collects the flight attitude and heading information of the UAV; the AHRS module contains a MEMS three-axis gyroscope, accelerometer and magnetometer, and outputs three-axis acceleration, three-axis angular velocity and three-axis geomagnetic field intensity;
a millimeter-wave radar sensor, which uses a linear-frequency-modulated triangular-wave scheme to realize long-range measurement of obstacles relative to the UAV and performs ranging in the range of 1 to 50 m (an illustrative sketch of triangular-wave FMCW range and velocity extraction is given after this list). The millimeter-wave radar sensor includes: an antenna module, which forms the transmit and receive beams needed for radar detection, transmits signals into every direction of the surveillance region and receives the echo signals scattered by obstacles in that region; an RF front-end module, which transmits and receives the signal according to the application scenario and functional requirements of the UAV obstacle-avoidance millimeter-wave radar; and a baseband processing module, which controls the transmit modulation waveform, signal acquisition and signal processing, resolves the relative distance, relative velocity and azimuth of the obstacle ahead and sends them to the main controller, thereby completing the acquisition and transmission of target obstacle data by the millimeter-wave radar sensor;
an ultrasonic radar sensor, which realizes short-range measurement of obstacles relative to the UAV and performs ranging in the range of 0 to 10 m;
a binocular vision sensor, which measures the size and shape of obstacles;
a main controller, which analyses the data obtained from each sensor and controls the UAV to complete the avoidance action.
The main controller is connected to the radar altitude sensor, the GPS/BeiDou positioning sensor, the AHRS module, the millimeter-wave radar sensor, the ultrasonic radar sensor and the binocular vision sensor respectively.
Preferably, there are four millimeter-wave radars, four ultrasonic radars and four binocular vision sensors, mounted on the front, rear, left and right faces of the UAV respectively. Because a multi-rotor UAV can fly in any horizontal direction, every face must be protected against collision, so each face carries one millimeter-wave radar sensor for long-range measurement, one ultrasonic radar sensor for short-range measurement and one binocular vision sensor for measuring target attributes such as azimuth, size and shape. The post-disaster rescue rotor UAV designed by the present invention therefore needs four millimeter-wave radar sensors, four ultrasonic sensors and four binocular vision sensors.
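As an illustration of how a triangular-wave FMCW radar of the kind described above recovers range and relative velocity, the sketch below applies the standard relations for the up-sweep and down-sweep beat frequencies. The symbols (sweep bandwidth, ramp time, carrier frequency) and the sign convention are generic textbook assumptions, not parameters taken from the patent.

```python
def fmcw_range_velocity(f_beat_up, f_beat_down, bandwidth_hz, ramp_time_s, f_carrier_hz):
    """Triangular FMCW: the range term adds to both beat frequencies while the
    Doppler term shifts them in opposite directions, so the two can be separated."""
    c = 3.0e8                                        # propagation speed (m/s)
    slope = bandwidth_hz / ramp_time_s               # chirp slope (Hz/s)
    f_range = (f_beat_up + f_beat_down) / 2.0        # beat due to range alone
    f_doppler = (f_beat_down - f_beat_up) / 2.0      # beat due to relative motion
    distance = c * f_range / (2.0 * slope)
    velocity = c * f_doppler / (2.0 * f_carrier_hz)  # positive for a closing target
    return distance, velocity
```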
Embodiment 2
As a further refinement of embodiment 1, the data acquisition layer processes the data collected by each sensor:
1) the millimeter-wave radar sensor outputs the relative distance R1 and relative velocity V1 between the UAV and the obstacle, and the angles between the obstacle and the radar normal, namely the azimuth angle θ1 and the pitch angle ψ1;
2) the ultrasonic radar sensor outputs the relative distance R2 between the UAV and the obstacle;
3) the binocular vision sensor outputs the obstacle area S, the azimuth angle θ2 and the relative distance R3;
4) the radar altitude sensor outputs the height value R4 of the UAV above the ground;
5) the GPS/BeiDou positioning sensor mainly provides the altitude H2 and the horizontal ground speed V2 of the UAV.
The GPS data follow the NMEA 0183 protocol, and every output sentence has a fixed standard format. The sentences most closely related to UAV navigation are GPGGA and GPVTG. Their data formats are:
(1) $GPGGA, UTC time, latitude, latitude hemisphere, longitude, longitude hemisphere, GPS fix mode, number of satellites, HDOP horizontal dilution of precision, altitude above mean sea level, M, geoidal separation, M, age of differential data, differential station ID*hh<CR><LF>.
(2) $GPVTG, course over ground relative to true north, T, course over ground relative to magnetic north, M, ground speed (knots), N, ground speed (km/h), K, mode indicator*hh<CR><LF>.
By extracting the altitude field of the GPGGA sentence, the altitude H2 of the UAV is obtained; by extracting the ground-speed (km/h) field of the GPVTG sentence, the horizontal speed V2 of the UAV is obtained (a parsing sketch is given after this list).
6) the AHRS module outputs the three-axis accelerations A_x, A_y, A_z, the three-axis angular velocities w_x, w_y, w_z and the three-axis geomagnetic field intensities m_x, m_y, m_z; from these data the current attitude of the UAV is calculated, namely the flight azimuth θ3, the pitch angle ψ2 and the roll angle φ.
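A minimal sketch of extracting the two fields described in 5) from raw NMEA 0183 sentences (checksum verification and malformed-sentence handling are omitted); the field positions follow the GPGGA and GPVTG layouts given above.

```python
def parse_gpgga_altitude(sentence):
    """$GPGGA: field 9 is the antenna altitude above mean sea level, in metres."""
    fields = sentence.split(",")
    return float(fields[9]) if fields[0].endswith("GGA") and fields[9] else None


def parse_gpvtg_speed_kmh(sentence):
    """$GPVTG: field 7 is the ground speed in km/h (the unit flag 'K' follows it)."""
    fields = sentence.split(",")
    return float(fields[7]) if fields[0].endswith("VTG") and fields[7] else None


# Example with typical sentences: H2 -> 545.4 m, V2 -> 10.2 km/h
H2 = parse_gpgga_altitude("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
V2 = parse_gpvtg_speed_kmh("$GPVTG,054.7,T,034.4,M,005.5,N,010.2,K*48")
```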
Embodiment 3
As a supplement to embodiment 1 or 2, the feature layer performs data fusion of the relative distance between the UAV and the obstacle, data fusion of the relative height between the UAV and the ground, data fusion of the relative velocity between the UAV and the obstacle, and acquisition of attribute features of the obstacle such as its size and shape.
The data fusion of the relative distance between the UAV and the obstacle is processed according to the distance range:
A. when the distance is in the range of 0 m to 10 m, the ultrasonic radar sensor, the binocular vision sensor and the millimeter-wave radar sensor all detect the obstacle, but their accuracies differ, the ultrasonic sensor being the most accurate at short range. To improve the accuracy of the resolved distance, a weighted average is used: weights α and β are introduced to average the ultrasonic radar, binocular vision and millimeter-wave radar measurements, and the weighted result is then passed through Kalman data fusion;
B. when the distance is in the range of 10 m to 20 m, the obstacle is beyond the measuring range of the ultrasonic radar, but the vision sensor and the millimeter-wave radar can still detect it, so in this range a weight α is introduced to average the two sensors, the binocular vision sensor and the millimeter-wave radar sensor, and the weighted result is then passed through Kalman data fusion;
C. when the distance is within 50 m but beyond the measuring ranges of the ultrasonic radar and the vision sensor, only the millimeter-wave radar can still detect the obstacle, so in this range no weighting is used and Kalman data fusion is performed directly on the data of the millimeter-wave radar sensor;
that is, the fused distance is computed piecewise over the three distance ranges described above.
The data fusion of the relative height between the UAV and the ground fuses the UAV height values acquired by the radar altitude sensor and the GPS/BeiDou positioning sensor, and is divided into two cases according to the height range. For heights below 100 m, both the radar altitude sensor and the GPS/BeiDou positioning sensor are used to detect the UAV height; the results are combined by a weighted average, i.e. a weight α is introduced to average the height values of the two sensors, and Kalman data fusion is then applied. Because the UAV height value is closely tied to the UAV attitude, the height value must be corrected according to the AHRS attitude data.
For heights above 100 m, only the height obtained from GPS/BeiDou positioning is accurate, so in this range the GPS/BeiDou positioning sensor is used alone; the acquired height data are passed directly through Kalman data fusion and then corrected using the AHRS attitude data.
Here H1 is the height acquired by the radar altitude sensor and H2 is the height acquired by the GPS/BeiDou positioning sensor. At the same time, obstacle identification is carried out from features such as the size and shape of the obstacle obtained by the binocular vision sensor; the identification can use artificial-intelligence methods such as pattern recognition or neural-network algorithms.
Optionally, the height is corrected using the AHRS attitude data as set out above, where ψ2 is the pitch angle, φ is the roll angle, H is the measured height and H′ is the corrected height value.
The data fusion of the relative velocity between the UAV and the obstacle uses the millimeter-wave radar sensor and the GPS/BeiDou positioning sensor; the speed data of the two sensors are combined by a weighted average, i.e.
V = α × V1 + (1 − α) × V2, where α is the weight ratio of the two sensors; the weighted speed is then passed through Kalman data fusion to obtain more accurate relative-velocity data. Obstacle identification is carried out from features such as the size and shape of the obstacle obtained by the binocular vision sensor; the identification can use artificial-intelligence methods such as pattern recognition or neural-network algorithms.
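The patent deliberately leaves the recognition method open (pattern recognition, neural networks and similar artificial-intelligence techniques). As one hedged illustration of the idea, a nearest-neighbour rule over simple binocular-derived features could separate the obstacle classes used by the decision level; the feature set and reference values below are invented for illustration and would be replaced by a trained model in practice.

```python
import math

# Illustrative reference feature vectors: (area in m^2, aspect ratio, obstacle height in m)
REFERENCE_FEATURES = {
    "person":            (0.8,  3.0,  1.7),
    "tree":              (6.0,  2.0,  8.0),
    "wall":              (40.0, 1.0, 10.0),
    "high_voltage_line": (0.3, 20.0, 15.0),
}


def classify_obstacle(area, aspect_ratio, obstacle_height):
    """Nearest neighbour over hand-picked reference vectors; a trained
    pattern-recognition or neural-network classifier would replace this."""
    sample = (area, aspect_ratio, obstacle_height)

    def distance_to(label):
        ref = REFERENCE_FEATURES[label]
        return math.sqrt(sum((s - r) ** 2 for s, r in zip(sample, ref)))

    return min(REFERENCE_FEATURES, key=distance_to)
```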
Embodiment 4
As a supplement to embodiment 1, 2 or 3, the decision level completes the avoidance as follows:
P1: first judge the relative distance between the UAV and the obstacle, and divide the relative distance into three ranges: less than N1 m, N1 m to N2 m, and N2 m to N3 m;
P2: after the distance division is complete, divide the situation into danger levels according to the relative velocity of the UAV and the obstacle:
when the distance is less than 10 m, if the speed is greater than 3 m/s and the warning time is less than 3 s, the situation belongs to the danger level; if the speed is less than 3 m/s, it belongs to the warning level;
when the distance satisfies 10 m ≤ R < 20 m, if the speed is greater than 6 m/s, the situation is at the danger level; if 3 m/s ≤ V < 6 m/s, it is at the warning level; if the speed is less than 3 m/s, it is at the prompt level;
when the distance satisfies 20 m ≤ R < 50 m, if the speed is greater than 16 m/s, the situation is at the danger level; if 6 m/s ≤ V < 16 m/s, it is at the warning level; if 3 m/s ≤ V < 6 m/s, it is at the prompt level; if the speed is less than 3 m/s, it is at the irrelevant level;
P3: then judge the height of the UAV above the ground, and divide the height value H into four grades:
when the height is less than X1 m, the main obstacles at this height include walls, trees and people; the attributes of the obstacle are recognized by the binocular vision sensor to distinguish walls, trees and people, and the UAV then hovers urgently and climbs; through the climb a person can be avoided completely, but for trees and walls a further height judgement is needed;
when X1 m ≤ H < X2 m, the main obstacles at this height include walls and trees; the attributes of the obstacle are recognized by the binocular vision sensor to distinguish walls and trees, and the UAV then hovers urgently and climbs; through the climb trees can be avoided completely, but for walls a further height judgement is needed;
when X2 m ≤ H < X3 m, the main obstacles at this height include walls and high-voltage lines; the attributes of the obstacle are recognized by the vision acquisition sensor to distinguish walls and high-voltage lines, and the UAV then hovers urgently and climbs; through the climb a high-voltage line can be avoided completely, but for a wall the height judgement must continue;
when the height H ≥ X3 m, the attributes of the obstacle at this height are recognized by the binocular vision sensor; if it is confirmed that the obstacle is still a wall surface, the UAV hovers urgently and then turns back to avoid it;
P4: for the danger level, step P3 must be carried out; for the warning level, step P3 must be carried out after emergency deceleration; for the prompt level and the irrelevant level, the third-step judgement is not needed and the process returns and detects again.
The foregoing is only a preferred embodiment of the present invention, but the scope of protection of the present invention is not limited to it. Any person skilled in the art who, within the technical scope disclosed by the present invention, makes equivalent substitutions or changes according to the technical solution of the present invention and its inventive concept shall be covered by the scope of protection of the present invention.

Claims (10)

1. A decision fusion method applied in an obstacle avoidance system, characterized in that a decision level performs the following processing:
P1: first judge the relative distance between the unmanned aerial vehicle (UAV) and the obstacle, and divide the relative distance into three ranges: less than N1 m, N1 m to N2 m, and N2 m to N3 m;
P2: after the distance division is complete, divide the situation into danger levels according to the relative velocity of the UAV and the obstacle;
P3: then judge the height of the UAV above the ground, and divide the height value H into four grades;
P4: for the danger level, step P3 must be carried out; for the warning level, step P3 must be carried out after emergency deceleration; for the prompt level and the irrelevant level, the third-step judgement is not needed and the process returns and detects again.
2. The decision fusion method applied in an obstacle avoidance system according to claim 1, characterized in that the division of danger levels in step P2 is specifically:
when the distance is less than N1 m, if the speed is greater than M1 m/s and the warning time is less than Q s, the situation belongs to the danger level, and if the speed is less than M1 m/s, it belongs to the warning level;
when the distance satisfies N1 m ≤ R < N2 m, if the speed is greater than M2 m/s, the situation is at the danger level; if M1 m/s ≤ V < M2 m/s, it is at the warning level; if the speed is less than M1 m/s, it is at the prompt level;
when the distance satisfies N2 m ≤ R < N3 m, if the speed is greater than M3 m/s, the situation is at the danger level; if M2 m/s ≤ V < M3 m/s, it is at the warning level; if M1 m/s ≤ V < M2 m/s, it is at the prompt level; if the speed is less than M1 m/s, it is at the irrelevant level.
3. The decision fusion method applied in an obstacle avoidance system according to claim 1, characterized in that the division into four grades in step P3 is specifically:
when the height is less than X1 m, the obstacles include walls, trees and people; the attributes of the obstacle are recognized by the binocular vision sensor to distinguish walls, trees and people, and the UAV then hovers urgently and climbs; through the climb a person can be avoided completely, but for trees and walls a further height judgement is needed;
when X1 m ≤ H < X2 m, the obstacles at this height include walls and trees; the attributes of the obstacle are recognized by the binocular vision sensor to distinguish walls and trees, and the UAV then hovers urgently and climbs; through the climb trees can be avoided completely, but for walls a further height judgement is needed;
when X2 m ≤ H < X3 m, the obstacles at this height include walls and high-voltage lines; the attributes of the obstacle are recognized by the vision acquisition sensor to distinguish walls and high-voltage lines, and the UAV then hovers urgently and climbs; through the climb a high-voltage line can be avoided completely, but for a wall the height judgement must continue;
when the height H ≥ X3 m, the attributes of the obstacle at this height are recognized by the binocular vision sensor; if it is confirmed that the obstacle is still a wall surface, the UAV hovers urgently and then turns back to avoid it.
4. The decision fusion method applied in an obstacle avoidance system according to claim 1, characterized in that the method further includes: the data acquisition layer processes the data collected by each sensor:
1) the millimeter-wave radar sensor outputs the relative distance R1 and relative velocity V1 between the UAV and the obstacle, and the angles between the obstacle and the radar normal, namely the azimuth angle θ1 and the pitch angle ψ1;
2) the ultrasonic radar sensor outputs the relative distance R2 between the UAV and the obstacle;
3) the binocular vision sensor outputs the obstacle area S, the azimuth angle θ2 and the relative distance R3;
4) the radar altitude sensor outputs the height value R4 of the UAV above the ground;
5) the GPS/BeiDou positioning sensor outputs the time T, the positioning state S (A for a valid fix, V for no fix), north latitude N or south latitude S, east longitude E or west longitude W, and the UAV speed V2;
6) the AHRS module outputs the three-axis accelerations A_x, A_y, A_z, the three-axis angular velocities w_x, w_y, w_z and the three-axis geomagnetic field intensities m_x, m_y, m_z, from which the current attitude of the UAV is calculated, namely the flight azimuth θ3, the pitch angle ψ2 and the roll angle φ.
5. The decision fusion method applied in an obstacle avoidance system according to claim 1 or 4, characterized in that the method further includes: the feature layer performs data fusion of the relative distance between the UAV and the obstacle, data fusion of the relative height between the UAV and the ground, data fusion of the relative velocity between the UAV and the obstacle, and acquisition of attribute features of the obstacle such as its size and shape.
6. The decision fusion method applied in an obstacle avoidance system according to claim 5, characterized in that the data fusion of the relative distance between the UAV and the obstacle is processed according to the distance range:
A. when the distance is in the range of 0 m to 10 m, the ultrasonic radar sensor, the binocular vision sensor and the millimeter-wave radar sensor all detect the obstacle; weights α and β are introduced to take a weighted average of the ultrasonic radar, binocular vision and millimeter-wave radar measurements, and the weighted result is then passed through Kalman data fusion;
B. when the distance is in the range of 10 m to 20 m, a weight α is introduced to take a weighted average of the two sensors, the binocular vision sensor and the millimeter-wave radar sensor, and the weighted result is then passed through Kalman data fusion;
C. when the distance is in the range of 20 m to 50 m, Kalman data fusion is performed directly on the data of the millimeter-wave radar sensor.
7. The decision fusion method applied in an obstacle avoidance system according to claim 6, characterized in that, according to the actual range, the distances acquired and calculated by the individual sensors are combined as follows:
8. The decision fusion method applied in an obstacle avoidance system according to claim 5, characterized in that the data fusion of the relative height between the UAV and the ground fuses the UAV height values acquired by the radar altitude sensor and the GPS/BeiDou positioning sensor, and the fusion of the height value is divided into two cases according to the height range:
for heights below 100 m, both the radar altitude sensor and the GPS/BeiDou positioning sensor are used to detect the UAV height; the results are combined by a weighted average, i.e. a weight α is introduced to average the height values of the two sensors, Kalman data fusion is then applied to the height value, and the height value is corrected according to the AHRS attitude data;
for heights above 100 m, the GPS/BeiDou positioning sensor is used alone; the acquired height data are passed directly through Kalman data fusion and then corrected using the AHRS attitude data;
where H1 is the height acquired by the radar altitude sensor and H2 is the height acquired by the GPS/BeiDou positioning sensor; at the same time, obstacle identification is carried out from features such as the size and shape of the obstacle obtained by the binocular vision sensor.
9. The decision fusion method applied in an obstacle avoidance system according to claim 8, characterized in that the height correction using the AHRS attitude data is specifically:
where ψ2 is the pitch angle, φ is the roll angle, H is the measured height and H′ is the corrected height value.
10. The decision fusion method applied in an obstacle avoidance system according to claim 5, characterized in that the data fusion of the relative velocity between the UAV and the obstacle uses the millimeter-wave radar sensor and the GPS/BeiDou positioning sensor; the speed data of the two sensors are combined by a weighted average, i.e. V = α × V1 + (1 − α) × V2, where α is the weight ratio of the two sensors; the weighted speed is then passed through Kalman data fusion, and obstacle identification is carried out from features such as the size and shape of the obstacle obtained by the binocular vision sensor.
CN201610724576.1A 2016-08-25 2016-08-25 Decision fusion method applied in an obstacle avoidance system Pending CN107783119A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610724576.1A CN107783119A (en) 2016-08-25 2016-08-25 Decision fusion method applied in an obstacle avoidance system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610724576.1A CN107783119A (en) 2016-08-25 2016-08-25 Decision fusion method applied in an obstacle avoidance system

Publications (1)

Publication Number Publication Date
CN107783119A true CN107783119A (en) 2018-03-09

Family

ID=61438590

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610724576.1A Pending CN107783119A (en) Decision fusion method applied in an obstacle avoidance system

Country Status (1)

Country Link
CN (1) CN107783119A (en)



Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2441643A (en) * 2006-09-05 2008-03-12 Honeywell Int Inc A collision avoidance system for unmanned aircraft
CN102707724A (en) * 2012-06-05 2012-10-03 清华大学 Visual localization and obstacle avoidance method and system for unmanned plane
CN103224026A (en) * 2012-12-05 2013-07-31 福建省电力有限公司 Special-purpose unmanned helicopter obstacle-avoidance system for mountain-area electrical network routing inspection and work flow thereof
CN103135550A (en) * 2013-01-31 2013-06-05 南京航空航天大学 Multiple obstacle-avoidance control method of unmanned plane used for electric wire inspection
CN104820429A (en) * 2015-04-28 2015-08-05 南京航空航天大学 Ultrasonic distance detection-based unmanned aerial vehicle obstacle avoidance system and control method thereof
CN105222760A (en) * 2015-10-22 2016-01-06 一飞智控(天津)科技有限公司 The autonomous obstacle detection system of a kind of unmanned plane based on binocular vision and method
CN105892489A (en) * 2016-05-24 2016-08-24 国网山东省电力公司电力科学研究院 Multi-sensor fusion-based autonomous obstacle avoidance unmanned aerial vehicle system and control method
CN107783106A (en) * 2016-08-25 2018-03-09 大连楼兰科技股份有限公司 Data fusion method between unmanned plane and barrier

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DAVID A. HAESSIG: ""Sense and Avoid" - What's required for aircraft safety?", SOUTHEASTCON 2016 *
ZHANG MIN (张敏): "Research on Key Technologies of Environment Perception for Ground Unmanned Combat Platforms" (《地面无人作战平台环境感知关键技术研究》), Vehicle & Power Technology (《车辆与动力技术》) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018166287A1 (en) * 2017-03-14 2018-09-20 北京京东尚科信息技术有限公司 Unmanned aerial vehicle positioning method and apparatus
CN109218983A (en) * 2018-06-28 2019-01-15 中国人民解放军国防科技大学 Positioning method and positioning system
CN109218983B (en) * 2018-06-28 2020-09-18 中国人民解放军国防科技大学 Positioning method and positioning system
CN109407662A (en) * 2018-08-31 2019-03-01 百度在线网络技术(北京)有限公司 Automatic driving vehicle control method and device
CN109407662B (en) * 2018-08-31 2022-10-14 百度在线网络技术(北京)有限公司 Unmanned vehicle control method and device
CN109254526A (en) * 2018-09-06 2019-01-22 南京航空航天大学 A kind of multilevel security redundancy control system hanging voluntarily transport trolley
CN108803666A (en) * 2018-09-11 2018-11-13 国网电力科学研究院武汉南瑞有限责任公司 A kind of line data-logging unmanned plane barrier-avoiding method and system based on millimetre-wave radar
CN111061279A (en) * 2020-01-03 2020-04-24 山东大学 Indoor self-adaptive cruise control system and method for electric sickbed
CN111522346A (en) * 2020-05-07 2020-08-11 国网四川省电力公司电力科学研究院 Intelligent obstacle avoidance method based on deep learning

Similar Documents

Publication Publication Date Title
CN107783106B Data fusion method between unmanned aerial vehicle and obstacle
CN206057974U Obstacle avoidance system applied to a rotor unmanned aerial vehicle
CN107783545B Obstacle avoidance system of post-disaster rescue rotor unmanned aerial vehicle based on OODA-loop multi-sensor information fusion
CN107783119A Decision fusion method applied in an obstacle avoidance system
Balestrieri et al. Sensors and measurements for unmanned systems: An overview
Alam et al. A survey of safe landing zone detection techniques for autonomous unmanned aerial vehicles (UAVs)
CN109029422B (en) Method and device for building three-dimensional survey map through cooperation of multiple unmanned aerial vehicles
US20190273909A1 (en) Methods and systems for selective sensor fusion
CN103135550B (en) Multiple obstacle-avoidance control method of unmanned plane used for electric wire inspection
CN103869822B (en) The perception of many rotor wing unmanned aerial vehicles and avoidance system and bypassing method thereof
CN107783548B (en) Data processing method based on multi-sensor information fusion technology
CN107783547A (en) Post disaster relief rotor wing unmanned aerial vehicle obstacle avoidance system and method
Scherer et al. Flying fast and low among obstacles
CN113597591A (en) Geographic reference for unmanned aerial vehicle navigation
CN107783544B (en) Method for controlling single-rotor plant protection unmanned aerial vehicle to avoid obstacle flight
CN109923492A (en) Flight path determines
CN107783549B (en) Single-rotor-wing plant protection unmanned aerial vehicle obstacle avoidance system based on multi-sensor information fusion technology
CN107608371A (en) Four rotor automatic obstacle avoiding unmanned plane under the environment of community in urban areas
CN109923589A (en) Building and update hypsographic map
CN104656663A (en) Vision-based UAV (unmanned aerial vehicle) formation sensing and avoidance method
CN105319969A (en) Unmanned aerial vehicle cooperative ground covering system
CN109683629A (en) Unmanned plane electric stringing system based on integrated navigation and computer vision
CN107577241A (en) A kind of fire-fighting unmanned aerial vehicle flight path planing method based on obstacle avoidance system
CN104851322A (en) Low-altitude flight target warning system and low-altitude flight target warning method based on Beidou satellite navigation system
CN116308944B (en) Emergency rescue-oriented digital battlefield actual combat control platform and architecture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180309