CN104833338A - Visual-based airplane landing assistant device - Google Patents

Visual-based airplane landing assistant device

Info

Publication number
CN104833338A
CN104833338A
Authority
CN
China
Prior art keywords
runway
angle
image
further characterized
aircraft
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510095986.XA
Other languages
Chinese (zh)
Inventor
张国飙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Haicun Information Technology Co Ltd
Original Assignee
Hangzhou Haicun Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Haicun Information Technology Co Ltd filed Critical Hangzhou Haicun Information Technology Co Ltd
Priority claimed from CN201310247045.4A external-priority patent/CN104006790A/en
Publication of CN104833338A publication Critical patent/CN104833338A/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C5/00Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels
    • G01C5/005Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels altimeters for aircraft

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a vision-based aircraft landing assistant device. The device obtains a series of runway images, applies a heel-angle ([gamma]) correction to the original runway images, and finally calculates the altitude (A) of the aircraft from the [gamma]-corrected runway images and the width (W) of the runway. The device may further include a sensor for measuring at least one attitude angle, which accelerates the image processing. A smartphone is well suited to serve as the landing assistant device.

Description

Vision-based aircraft landing assistant device
Technical field
The present invention relates to the field of aviation and, more particularly, to landing assistant devices for aircraft.
Background technology
Landing is the most challenging part of a flight. When an aircraft enters the ground-effect region, the pilot pulls the nose up to reduce the descent rate. This maneuver is called the flare, and the height at which it begins is called the flare altitude. For small aircraft, the flare altitude is generally between 5 m and 10 m above the ground. Because student pilots usually find the flare altitude difficult to judge, they need hundreds of practice landings to master it. These many landing exercises lengthen the training time, waste a large amount of fuel, and harm the environment. Although a radar altimeter or laser altimeter can help with the flare, they are expensive. A low-cost landing assistant device is therefore preferable for helping student pilots master the landing skill.
The prior art also uses computer vision to assist aircraft landing. U.S. Patent 8,315,748 (inventor: Lee; issued November 20, 2012) proposes a vision-based height measurement method. It uses a circular marker as a reference during the takeoff and landing of a vertical takeoff and landing (VTOL) aircraft. A camera on the aircraft first captures an image of the circular marker; the horizontal and vertical diameters of the marker in the image are then measured; finally, the aircraft altitude is calculated from these diameters, the actual diameter of the marker, the distance between the marker and the takeoff/landing point, and the attitude of the aircraft (i.e., the heading angle, pitch angle, and heel angle). For a fixed-wing aircraft, the distance between the marker and the aircraft's ground-projection point changes continuously, so this method is not applicable.
Summary of the invention
A primary object of the present invention is to provide a low-cost aircraft landing assistant device.
Another object of the present invention is to help student pilots master the landing skill.
Another object of the present invention is to save energy and improve environmental quality.
To achieve these goals, the present invention proposes a vision-based aircraft landing assistant device. It comprises a camera and a processor. The camera is mounted at the front of the aircraft, facing the runway, and captures a series of original runway images. The processor extracts the heel angle γ from an original runway image. Once γ is obtained, the original runway image is rotated by -γ about its optical origin to perform the γ correction; the horizon of the corrected runway image becomes horizontal (if the horizon is visible). All subsequent image processing is carried out in the corrected runway image. The horizontal line through the optical origin is called the principal horizontal line H, and the vertical line through the optical origin is called the principal vertical line V. The intersection point of the extension lines of the left and right runway edges is labeled P. Its coordinate X_P (the distance between P and the principal horizontal line H) gives the pitch angle ρ = atan(X_P/f), and its coordinate Y_P (the distance between P and the principal vertical line V) gives the heading angle α = atan[(Y_P/f)*cos(ρ)], where f is the focal length of the camera. Finally, the distance Δ between the intersection points A and B of the runway-edge extension lines with the principal horizontal line H gives the aircraft altitude A = W*sin(ρ)/cos(α)/(Δ/f), where W is the runway width. Alternatively, the angles θ_A and θ_B between the runway-edge extension lines and the principal horizontal line H give A = W*cos(ρ)/cos(α)/[cot(θ_A) - cot(θ_B)].
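The formulas above can be sketched in a few lines of Python. This is an illustrative sketch, not part of the patent; the function name and the input values (X_P, Y_P, Δ, W, f, assumed to be measured from a γ-corrected runway image, all lengths in consistent units) are invented here:

```python
import math

def attitude_and_altitude(X_P, Y_P, delta, W, f):
    """Pitch, heading, and altitude from a gamma-corrected runway image.

    X_P, Y_P: coordinates of the vanishing point P of the runway-edge
              extension lines (distances to the principal horizontal
              line H and principal vertical line V, respectively).
    delta:    distance between the points A and B where the two edge
              extension lines cross the principal horizontal line H.
    W:        runway width; f: camera focal length.
    """
    rho = math.atan(X_P / f)                               # pitch angle
    alpha = math.atan((Y_P / f) * math.cos(rho))           # heading angle
    A = W * math.sin(rho) / math.cos(alpha) / (delta / f)  # altitude
    return rho, alpha, A
```

Feeding the function image quantities generated from known angles recovers those angles, which is a quick sanity check of the formula chain.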
The aircraft landing assistant device may further comprise a sensor, such as an inertial sensor (e.g., a gyroscope) or a magnetic-field sensor (e.g., a magnetometer). It can measure attitude angles (e.g., the pitch angle ρ, heading angle α, and heel angle γ). Using sensor-measured attitude angles directly simplifies the altitude computation. For example, the measured heel angle γ can be used directly to rotate the original runway image, and the measured pitch angle ρ and heading angle α can be used directly to compute the altitude. Using sensor data reduces the processor workload and accelerates the image processing.
This vision-based height measurement is particularly suitable for installation on a smartphone as an application (app). A smartphone contains all the components the measurement needs (camera, sensor, and processor). Because smartphones are ubiquitous, the vision-based landing assistant device requires no extra hardware; only a "landing assistant" app needs to be installed on the smartphone. Such a software-based landing assistant device has minimal cost.
Accordingly, the present invention proposes a vision-based aircraft landing assistant device, comprising: an image unit, which captures at least one original runway image; and a processing unit, which measures characteristics of the extension lines of the left and right runway edges in a corrected runway image and calculates the aircraft altitude (A) from those characteristics and the runway width (W), the corrected runway image being obtained by rotating the original runway image.
Brief description of the drawings
Fig. 1 shows the relative position of an airplane and a runway.
Fig. 2A-Fig. 2C are functional block diagrams of three vision-based aircraft landing assistant devices.
Fig. 3 illustrates the definition of the heel angle (γ).
Fig. 4 is an original runway image.
Fig. 5 is a corrected runway image.
Fig. 6 illustrates the definition of the pitch angle (ρ).
Fig. 7 illustrates the definition of the heading angle (α).
Fig. 8 represents a vision-based height measurement method.
Fig. 9A-Fig. 9B show an aircraft landing assistant device with an orienting function.
Note that these figures are only schematic and are not drawn to scale. For clarity and convenience, some dimensions and structures in the figures may be enlarged or reduced. In different embodiments, the same reference symbol generally denotes a corresponding or similar structure.
Embodiments
In the embodiment of Fig. 1, an aircraft 10 carries a vision-based landing assistant device 20. The device 20 is mounted behind the windshield of the aircraft 10, facing forward. It can be a camera, a computer or computer-like device with a camera, or a smartphone. Its optical origin is labeled O'. The landing assistant device 20 uses computer vision to measure its height A above the ground 0. The runway 100 lies on the ground 0 ahead of the aircraft; its length is L and its width is W. Here, the ground coordinate system is defined as follows: its origin o is the projection of O' onto the ground 0; its x-axis is parallel to the longitudinal axis of the runway 100 (the runway-length direction); its y-axis is parallel to the transverse axis of the runway (the runway-width direction); and its z-axis is perpendicular to the x-y plane. The z-axis is defined solely by the runway surface and is shared by many of the coordinate systems in this specification.
Fig. 2A-Fig. 2C show three vision-based aircraft landing assistant devices 20. The embodiment of Fig. 2A contains a camera 30 and a processor 70. It computes the altitude A from the runway width W and the runway images captured by the camera 30. The user can obtain the runway width W from an airport directory and enter it manually; the landing assistant device 20 can also retrieve the runway width W electronically from an airport database. This aircraft landing assistant device 20 can measure the height, predict the future height of the aircraft, and give the pilot an indication (e.g., a visual and/or audio cue) before the decision point. For example, two seconds before a landing maneuver (such as the flare or a pre-flare maneuver), it emits two short beeps and one long beep: the pilot should get ready at the two short beeps and act at the final long beep.
Compared with Fig. 2A, the embodiment of Fig. 2B further comprises a sensor 40, such as an inertial sensor (e.g., a gyroscope) or a magnetic-field sensor (e.g., a magnetometer). It can measure attitude angles (e.g., the pitch angle ρ, heading angle α, and heel angle γ). Using sensor-measured attitude angles directly simplifies the altitude computation. For example, the measured heel angle γ can be used directly to rotate the original runway image, and the measured pitch angle ρ and heading angle α can be used directly to compute the altitude (see Fig. 8). Using sensor data reduces the processor workload and accelerates the image processing.
The embodiment of Fig. 2C is a smartphone 80. It further comprises a memory 50, which stores an "aircraft landing" application (app) 60. By running the "aircraft landing" app 60, the smartphone 80 can measure the height, predict the future height of the aircraft, and give the pilot an indication before the decision point. A smartphone contains all the components needed for the height measurement (camera, sensor, and processor) and can easily assist aircraft landing. Because smartphones are ubiquitous, the vision-based landing assistant device requires no extra hardware; only a "landing assistant" app needs to be installed on the smartphone. Such a software-based landing assistant device has minimal cost.
Fig. 3-Fig. 5 describe a method of obtaining the heel angle (γ). Fig. 3 defines the heel angle (γ) of the camera 30. Because the image sensor 32 of the camera 30 (e.g., a CCD or CMOS sensor) is rectangular in the image plane 36, the original image coordinate system XYZ can be defined as follows: the origin O is the optical origin of the image sensor 32; the X and Y axes are the two center lines of the rectangle; and the Z axis is perpendicular to the X-Y plane. The line N here is perpendicular to both the z and Z axes and is always parallel to the runway plane. The heel angle (γ) is defined as the angle between the Y axis and the line N. Rotating the image coordinate system XYZ by -γ about the Z axis yields the corrected image coordinate system X*Y*Z*. The line N is also the Y* axis of the corrected image coordinate system.
Fig. 4 is an original runway image 100i captured by the camera 30. Because the camera 30 has a heel angle γ, the horizon 120i in the image is tilted; the angle between it and the Y axis is γ. Rotating the image 100i by -γ about the origin O performs the γ correction. Fig. 5 is the γ-corrected runway image 100*; its horizon 120* is horizontal, i.e., parallel to the Y* axis. In the corrected runway image 100*, the horizontal line through its optical origin O (the Y* axis) is called the principal horizontal line H, and the vertical line through O (the X* axis) is called the principal vertical line V. Fig. 6-Fig. 8 analyze the corrected runway image 100* further.
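The γ correction described above amounts to rotating every image point by -γ about the optical origin O. A minimal sketch, with an invented helper name and the sign convention assumed to match the patent's (a horizon tilted by +γ becomes horizontal):

```python
import math

def gamma_correct(points, gamma):
    """Rotate image points (X, Y) by -gamma about the optical origin O.

    Illustrative helper: gamma is in radians, and points is a list of
    (X, Y) pairs in the original image coordinate system.
    """
    c, s = math.cos(-gamma), math.sin(-gamma)  # standard 2-D rotation by -gamma
    return [(c * x - s * y, s * x + c * y) for x, y in points]
```

A point lying on the tilted horizon (i.e., a point on the Y* axis rotated by +γ) maps back onto the horizontal axis, which is exactly the effect step 220 of Fig. 8 requires.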
Fig. 6 defines the pitch angle (ρ) of the camera 30. The optical coordinate system X'Y'Z' is the corrected image coordinate system X*Y*Z* translated along the Z* axis by the distance f, where f is the focal length of the lens 38. An α-corrected ground coordinate system x*y*z* is also defined (see Fig. 7); its origin o* and z* axis coincide with those of the ground coordinate system xyz, and its x* axis lies in the same vertical plane as the X' axis. The distance from the optical origin O' of the lens to the ground (i.e., to the origin o*) is the altitude A. The pitch angle (ρ) is the angle between the Z' axis and the x* axis. For a point R on the ground 0 with coordinates (x*, y*, 0) in the corrected ground coordinate system x*y*z*, the coordinates (X*, Y*, 0) of its image on the image sensor 32 (in the corrected image coordinate system X*Y*Z*) can be expressed as: δ = ρ - atan(A/x*), X* = -f*tan(δ), Y* = f*y*/sqrt(x*^2 + A^2)/cos(δ).
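The projection equations for the point R can be checked numerically. The sketch below (an illustration with an invented function name, not from the patent) implements them verbatim:

```python
import math

def project_ground_point(xs, ys, A, rho, f):
    """Image coordinates (X*, Y*) of the ground point R = (x*, y*, 0),
    per the equations above:
        delta = rho - atan(A / x*)
        X* = -f * tan(delta)
        Y* = f * y* / sqrt(x*^2 + A^2) / cos(delta)
    """
    delta = rho - math.atan(A / xs)
    Xs = -f * math.tan(delta)
    Ys = f * ys / math.hypot(xs, A) / math.cos(delta)
    return Xs, Ys
```

One easy consistency check: a ground point whose depression angle atan(A/x*) equals the pitch angle ρ lies on the optical axis and therefore projects to X* = 0.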
Fig. 7 defines the heading angle (α) of the camera 30. The figure shows the ground coordinate system xyz and the corrected ground coordinate system x*y*z*; they differ by a rotation of α about the z axis. Note that α is defined relative to the longitudinal axis (length direction) of the runway 100. Although the x axis is parallel to the longitudinal axis of the runway 100, computing in the corrected ground coordinate system x*y*z* is more efficient, so this specification analyzes the runway image in that coordinate system.
Fig. 8 illustrates the steps of a height measurement. First, the heel angle γ is extracted from the horizon 120i of the original runway image (Fig. 4, step 210). Once γ is obtained, the original runway image is rotated by -γ about the optical origin to perform the γ correction (Fig. 5, step 220). In the corrected runway image 100*, the intersection point of the runway-edge extension lines 160* and 180* is labeled P. Its coordinates (X_P, Y_P), where X_P is the distance between P and the principal horizontal line H and Y_P is the distance between P and the principal vertical line V, can be expressed as X_P = f*tan(ρ) and Y_P = f*tan(α)/cos(ρ). From these, the pitch angle ρ = atan(X_P/f) (Fig. 5, step 230) and the heading angle α = atan[(Y_P/f)*cos(ρ)] (Fig. 5, step 240) are calculated.
Finally, the distance Δ between the intersection points A and B of the runway-edge extension lines 160*, 180* with the principal horizontal line H is measured (Fig. 5, step 250), and the aircraft altitude is calculated as A = W*sin(ρ)/cos(α)/(Δ/f). Alternatively, the angles θ_A and θ_B between the runway-edge extension lines and the principal horizontal line H can be used to calculate A = W*cos(ρ)/cos(α)/[cot(θ_A) - cot(θ_B)].
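The alternative altitude formula using the edge angles θ_A and θ_B can likewise be sketched (illustrative only; the symmetric example angles used below are invented, not measurements from the patent):

```python
import math

def altitude_from_edge_angles(theta_a, theta_b, rho, alpha, W):
    """Altitude from the angles theta_a and theta_b that the runway-edge
    extension lines make with the principal horizontal line H
    (all angles in radians, W is the runway width)."""
    cot = lambda t: 1.0 / math.tan(t)
    return W * math.cos(rho) / math.cos(alpha) / (cot(theta_a) - cot(theta_b))
```

With symmetric edges at 60° and 120° (level flight, zero heading angle, W = 30), the denominator is cot 60° - cot 120° = 2/√3, giving an altitude of 15√3 ≈ 25.98.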
To those skilled in the art, the steps of Fig. 8 can be skipped or reordered. For example, when the sensor 40 measures at least one attitude angle (e.g., the pitch angle ρ, heading angle α, or heel angle γ), the measured heel angle γ can be used directly to rotate the original runway image (skipping step 210), and the measured pitch angle ρ and heading angle α can be used directly to compute the altitude (skipping steps 230 and 240). Using sensor data reduces the processor workload and accelerates the image processing.
Fig. 9A-Fig. 9B show an aircraft landing assistant device 20 with an orienting function. The orienting function keeps the horizon in the runway image horizontal at all times, so no γ correction of the runway image is needed, which simplifies the altitude computation. Specifically, the landing assistant device (e.g., a mobile phone) 20 is placed in an orienting fixture 19. The fixture 19 consists of a cradle 18, a weight 14, and a phone base 12. A support 17 is fixed to the aircraft 10, and the cradle 18 rests on the support 17 through a ball bearing 16. Whether the aircraft 10 is level (Fig. 9A) or has a pitch angle ρ (Fig. 9B), the weight 14 keeps the longitudinal axis of the phone 20 aligned with the direction of gravity z. The weight 14 preferably contains a metallic material so that it forms a damper with a magnet 15, which helps stabilize the cradle 18.
It should be understood that the form and details of the present invention may be changed without departing from its spirit and scope, and such changes do not prevent application of the spirit of the present invention. For example, although the embodiments herein are applied to fixed-wing aircraft, the invention can also be used on rotary-wing aircraft (e.g., helicopters) or unmanned aerial vehicles (UAVs). Accordingly, the present invention should not be limited except in accordance with the spirit of the appended claims.

Claims (10)

1. A vision-based aircraft landing assistant device, characterized by comprising:
an image unit, the image unit capturing at least one original runway image;
a processing unit, the processing unit measuring characteristics of the extension lines of the left and right runway edges in a corrected runway image and calculating the aircraft altitude (A) from said characteristics and the runway width (W), the corrected runway image being obtained by rotating the original runway image.
2. The device according to claim 1, further characterized in that: said characteristics comprise the distance (Δ) between the intersection points of the runway-edge extension lines with the principal horizontal line, and the altitude (A) is calculated by the formula A = W*sin(ρ)/cos(α)/(Δ/f), where f is the focal length of the lens of the image unit, ρ is the pitch angle, and α is the heading angle.
3. The device according to claim 1, further characterized in that: said characteristics comprise the angles (θ_A, θ_B) between the runway-edge extension lines and the principal horizontal line, and the altitude (A) is calculated by the formula A = W*cos(ρ)/cos(α)/[cot(θ_A) - cot(θ_B)], where ρ is the pitch angle and α is the heading angle.
4. The device according to claim 1, further characterized in that: said characteristics comprise the distance (X_P) between the intersection point of the runway-edge extension lines and the principal horizontal line of the rotated runway image, and the pitch angle (ρ) is calculated by the formula ρ = atan(X_P/f), where f is the focal length of the lens of the image unit.
5. The device according to claim 1, further characterized in that: said characteristics comprise the distance (Y_P) between the intersection point of the runway-edge extension lines and the principal vertical line of the rotated runway image, and the heading angle (α) is calculated by the formula α = atan[(Y_P/f)*cos(ρ)], where f is the focal length of the lens of the image unit and ρ is the pitch angle.
6. The device according to claim 1, further characterized in that: when the horizon of the original runway image is not horizontal, said processing unit rotates the original runway image to obtain the corrected runway image, the horizon of the corrected runway image being horizontal.
7. The device according to claim 1, further characterized by comprising: a sensor, the sensor measuring at least one attitude angle.
8. The device according to claim 1, further characterized by comprising: an orienting unit, the orienting unit ensuring that the original runway image has no heel angle.
9. The device according to claim 1, further characterized in that: the aircraft is a fixed-wing aircraft, a rotary-wing aircraft, or an unmanned aerial vehicle.
10. The device according to claim 1, further characterized in that: the device is a smartphone.
CN201510095986.XA 2013-06-21 2013-06-21 Visual-based airplane landing assistant device Pending CN104833338A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310247045.4A CN104006790A (en) 2013-02-21 2013-06-21 Vision-Based Aircraft Landing Aid

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201310247045.4A Division CN104006790A (en) 2013-02-21 2013-06-21 Vision-Based Aircraft Landing Aid

Publications (1)

Publication Number Publication Date
CN104833338A true CN104833338A (en) 2015-08-12

Family

ID=53835828

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510095986.XA Pending CN104833338A (en) 2013-06-21 2013-06-21 Visual-based airplane landing assistant device

Country Status (1)

Country Link
CN (1) CN104833338A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113295164A (en) * 2021-04-23 2021-08-24 四川腾盾科技有限公司 Unmanned aerial vehicle visual positioning method and device based on airport runway
CN113295164B (en) * 2021-04-23 2022-11-04 四川腾盾科技有限公司 Unmanned aerial vehicle visual positioning method and device based on airport runway
CN114493207A (en) * 2022-01-14 2022-05-13 山东航空股份有限公司 Quick takeoff and landing runway identification method and system based on QAR data

Similar Documents

Publication Publication Date Title
CN104006790A (en) Vision-Based Aircraft Landing Aid
KR102003152B1 (en) Information processing method, device, and terminal
CN111448476B (en) Technique for sharing mapping data between unmanned aerial vehicle and ground vehicle
CN106774431B (en) Method and device for planning air route of surveying and mapping unmanned aerial vehicle
US11884418B2 (en) Control device, control method, and flight vehicle device
CN106708066B (en) View-based access control model/inertial navigation unmanned plane independent landing method
US11604479B2 (en) Methods and system for vision-based landing
US20200344464A1 (en) Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Defects
US20190096069A1 (en) Systems and methods for visual target tracking
Bayard et al. Vision-based navigation for the NASA mars helicopter
CN105974940B (en) Method for tracking target suitable for aircraft
Cocchioni et al. Autonomous navigation, landing and recharge of a quadrotor using artificial vision
CN106124517A (en) Detect many rotor wing unmanned aerial vehicles detection platform system in structural member surface crack and for the method detecting structural member surface crack
CN109753076A (en) A kind of unmanned plane vision tracing implementing method
US10703508B1 (en) Stereoscopic flight simulator with data acquisition
JP6430073B2 (en) Attitude estimation apparatus, attitude estimation method, and observation system
CN104007766A (en) Flight control method and device for unmanned aerial vehicle
CN102840852A (en) Aerial photograph image pickup method and aerial photograph image pickup apparatus
CN111670339A (en) Techniques for collaborative mapping between unmanned aerial vehicles and ground vehicles
CN111415409A (en) Modeling method, system, equipment and storage medium based on oblique photography
CN104061904A (en) Method for rapidly and accurately determining shape and area of forest gap
CN109341686A (en) A kind of tightly coupled aircraft lands position and orientation estimation method of view-based access control model-inertia
CN108225273A (en) A kind of real-time runway detection method based on sensor priori
US20210229810A1 (en) Information processing device, flight control method, and flight control system
CN104833338A (en) Visual-based airplane landing assistant device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150812