EP4055343A1 - Method and device for assisting the landing of an aircraft in poor visibility conditions - Google Patents

Method and device for assisting the landing of an aircraft in poor visibility conditions

Info

Publication number
EP4055343A1
Authority
EP
European Patent Office
Prior art keywords
data
aircraft
landing
sensor
pilot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20797523.6A
Other languages
English (en)
French (fr)
Inventor
Thierry Ganille
Jean-Emmanuel HAUGEARD
Pierre-Yves Dumas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thales SA
Original Assignee
Thales SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thales SA filed Critical Thales SA
Publication of EP4055343A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C23/00 Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
    • G01C23/005 Flight directors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D43/00 Arrangements or adaptations of instruments
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00 Aircraft indicators or protectors not otherwise provided for
    • B64D45/04 Landing aids; Safety measures to prevent collision with earth's surface
    • B64D45/08 Landing aids; Safety measures to prevent collision with earth's surface optical
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/933 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
    • G01S13/934 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft on airport surfaces, e.g. while taxiing
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/933 Lidar systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/176 Urban or other man-made structures
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0017 Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G5/0021 Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/02 Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
    • G08G5/025 Navigation or guidance aids
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]

Definitions

  • the invention relates to the field of landing assistance systems for aircraft based on on-board imaging cameras or sensors.
  • the invention more specifically addresses the problem of assisting the landing of aircraft in difficult weather conditions, in particular conditions of reduced or degraded visibility, in the event of fog for example.
  • for example, a decision height (DH) is 60.96 meters (200 ft), and the corresponding decision altitude (DA) is the altitude of the runway + 60.96 meters (+200 ft).
  • the ILS (Instrument Landing System) is based on several items of radio-frequency equipment installed on the ground, at the landing runway, and on a compatible instrument placed on board the aircraft.
  • the use of such a guidance system requires expensive equipment and specific pilot qualification.
  • moreover, it cannot be installed at every airport: its cost makes it unacceptable anywhere but the main airports.
  • new technologies based on satellite positioning systems will probably replace ILS systems in the future.
  • a so-called synthetic vision solution, SVS (Synthetic Vision System), makes it possible to display the terrain and the landing runways from the position of the aircraft supplied by a GPS and from its attitude supplied by its inertial unit.
  • the uncertainty in the position of the aircraft, together with the limited precision of the runway positions stored in databases, prohibits the use of an SVS in critical phases close to the ground, such as landing and take-off.
  • so-called SVGS (Synthetic Vision with Guidance System) solutions add certain checks to an SVS and allow a limited reduction in landing minima (the decision height DH is reduced by only 15.24 meters (50 ft), and only on SA CAT I ILS approaches).
  • another solution, known as enhanced vision (EVS, or EFVS for Enhanced (Flight) Vision System), uses electro-optical, infrared or radar sensors to film the airport environment during the landing of an aircraft.
  • the principle is to use sensors that perform better than the pilot's eye in degraded weather conditions, and to embed the information collected by the sensors into the pilot's field of vision, through a head-up display or on the visor of a helmet worn by the pilot.
  • This technique is essentially based on the use of sensors to detect the radiation from the lamps arranged along the runway and on the approach ramp.
  • Incandescent lamps produce visible light, but they also emit in the infrared range. Sensors operating in the infrared can detect this radiation, and in degraded weather conditions their detection range is better than that of the human eye in the visible range. To a certain extent, this improvement in visibility makes it possible to improve the approach phases and to limit abandoned approaches.
  • this technique relies, however, on parasitic infrared radiation from the lamps installed in the vicinity of the runway. For reasons of lamp durability, the current trend is to replace incandescent lamps with LED lamps, whose spectrum extends less far into the infrared. A side effect is therefore the technical obsolescence of EVS systems based on infrared sensors.
  • an alternative to infrared sensors is to obtain images with a radar sensor, in the centimeter or millimeter band. Certain frequency bands, chosen outside the water-vapor absorption peaks, have very low sensitivity to severe weather conditions; such sensors therefore make it possible to produce an image through fog, for example. However, even though these sensors have fine range resolution, their angular resolution is much coarser than that of optical solutions: it is directly related to the size of the antennas used, and it is often too coarse to position the runway precisely at a distance sufficient to perform the realignment maneuvers.
  • Figure 1 illustrates a landing phase piloting symbology for a head-up display of an EVS system. The conformity of symbols depends primarily on the accuracy of the aircraft attitude data.
  • An object of the invention is to overcome the drawbacks of the known techniques by meeting the aforementioned needs with a solution for assisting the landing of aircraft, in particular an aid to visual identification for the pilot before reaching the decision height (DH) or decision altitude (DA).
  • a computer-implemented method for assisting aircraft landing in conditions of degraded visibility comprising at least the steps of:
  • the data reception step consists of receiving data from a forward-looking sensor on board the aircraft, said sensor being chosen from the group consisting of piloting FLIR sensors, multispectral cameras, LIDAR and millimeter-wave radar.
  • the step of determining data of interest consists in executing an artificial intelligence algorithm on the received sensor data, the algorithm implementing an artificial intelligence model trained for image processing, obtained during a learning phase through deep learning.
  • deep learning is based on convolutional neural networks.
  • the step of calculating a target area consists in determining, from the characteristics of the sensor and the attitude of the aircraft corresponding to the sensor data, the heading and elevation coordinates of the target area.
  • the method comprises, after the step of calculating a target area, a step of sending the coordinates of the target area to the head-up display device.
  • the coordinates of the target area correspond to two opposite corners of a rectangle surrounding the data of interest characteristic of said landing runway and/or said approach ramp, and the conformal symbol which is displayed is said surrounding rectangle.
  • the head-up display step consists in displaying the framing symbol on a fixed head-up screen in the cockpit and / or on a head-up screen worn by the pilot.
  • the invention also covers a computer program product comprising code instructions making it possible to perform the steps of the method for assisting in aircraft landing, in particular in conditions of degraded visibility, as claimed, when the program is run on a computer.
  • the invention further covers a device for assisting the landing of an aircraft, in particular in conditions of degraded visibility, the device comprising means for implementing the steps of the method for assisting the landing of an aircraft in degraded visibility conditions according to any one of the claims.
  • the data allowing the calculation of the target area come from a first sensor, the device further comprising a second sensor capable of providing an image that can be displayed in the head-up device worn by the pilot, the conformal symbol calculated from the data of the first sensor being displayed on said image supplied by the second sensor.
  • Another object of the invention is a man-machine interface comprising means for displaying a conformal symbol obtained according to the claimed method.
  • Another object of the invention is a landing aid system, in particular of the SVS, SVGS, EVS, EFVS or CVS type, embedding an aircraft landing aid device, in particular for conditions of degraded visibility, as claimed.
  • the invention also addresses an aircraft comprising an aircraft landing aid device, in particular in conditions of degraded visibility, as claimed.
  • FIG.2 a method of assisting the landing of an aircraft making it possible to obtain a conformal symbol for a head-up display, according to one embodiment of the invention
  • FIG.3 a head-up display of an EVS system with the display of a conformal symbol obtained by the method of the invention
  • FIG.4 a display of a symbol according to the invention on an IR image
  • FIG.5 a general architecture of a display system for implementing the method of the invention.
  • FIG. 2 illustrates the steps of a method 200 for assisting the landing of an aircraft, making it possible to obtain a conformal symbol for a head-up display, according to one embodiment of the invention.
  • the method begins upon receipt 202 of sensor data from a forward-looking sensor on board an aircraft.
  • the method of the invention applies to any type of sensor, whether it is a pilot FLIR providing an IR image, a multispectral camera, a LIDAR, or a millimeter radar.
  • the technical problem that the invention solves is that of aid in the detection of the landing strip of an aircraft by the pilot in a sensor image or in direct vision before descending below the decision height, especially in degraded visibility conditions.
  • the invention allows the display of a new conformal symbol generated by automatic detection of the approach ramp or the landing runway in data coming from a sensor (sensor data).
  • the symbol displayed is perfectly conformal with the outside world and indicates to the pilot the zone where the runway and/or the approach ramp will appear, before they are visible to the pilot with the naked eye in direct vision.
  • the method makes it possible to determine, in the received sensor data, data of interest which are characteristic of the landing runway and/or of the approach ramp.
  • the sensor can be an IR or multispectral sensor whose image is presented to the pilot, or a second sensor of the active type, in principle more efficient than an IR sensor, such as for example a millimeter-wave radar, whose data are not displayable to the pilot because they are difficult to interpret.
  • the determination of data of interest consists in implementing a conventional algorithm for detecting lines and patterns.
  • the Applicant's patent application FR3049744 describes an example of such a conventional detection algorithm.
  • the algorithm consists in calculating a bounding box of the detected elements of interest, in the form of a rectangle whose two opposite corners have, in pixel coordinates, respectively the smallest X and Y coordinates and the largest X and Y coordinates among the pixels belonging to the detected elements.
  • the area of the rectangle can be increased by a few percent, for example 10%, while remaining centered on the initial rectangle.
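  • a minimal sketch of this bounding-box computation (Python; the detected pixels are assumed to be an (N, 2) array, and enlarging each side by the margin is one simple reading of "increasing the area by a few percent"):

```python
import numpy as np

def enclosing_rectangle(pixels: np.ndarray, margin: float = 0.10):
    """Bounding box of detected elements of interest: two opposite corners
    at (min X, min Y) and (max X, max Y), enlarged by `margin` while
    remaining centered on the initial rectangle."""
    x_min, y_min = pixels.min(axis=0)
    x_max, y_max = pixels.max(axis=0)
    dx = (x_max - x_min) * margin / 2.0
    dy = (y_max - y_min) * margin / 2.0
    return (x_min - dx, y_min - dy), (x_max + dx, y_max + dy)
```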
  • the step of determining data of interest consists in executing an artificial intelligence algorithm on the received sensor data, the algorithm implementing an artificial intelligence image-processing model trained for landing runway and approach ramp detection.
  • the trained model is carried on board the aircraft for operational use; it was obtained during a learning phase, in particular by deep learning with an artificial neural network for runway and ramp detection.
  • the artificial neural network is a convolutional neural network (CNN for “Convolutional Neural Network”).
  • a classical CNN-based model can be set up for the detection and segmentation of runway and ramp, for example using a Mask-RCNN (Regions with CNN features) architecture with a ResNet-101 (101 layers) backbone [Mask R-CNN - Kaiming et al. 2017]. From this model, transfer learning (followed by finer training) can be carried out to adapt it to the runway and ramp use case.
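  • a hedged sketch of such a transfer-learning setup: the patent cites a Mask-RCNN with a ResNet-101 backbone, while torchvision's ResNet-50-FPN variant is used below purely for illustration, and the three classes are an assumption:

```python
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_CLASSES = 3  # assumed: background, runway, approach ramp

# Start from a model pre-trained on a generic dataset (transfer learning).
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the box-classification head, sized for the runway/ramp classes.
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# Replace the mask head likewise; fine-tuning then adapts it to the use case.
in_channels = model.roi_heads.mask_predictor.conv5_mask.in_channels
model.roi_heads.mask_predictor = MaskRCNNPredictor(in_channels, 256, NUM_CLASSES)
```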
  • the goal of deep learning is to model data with a high level of abstraction.
  • the learning phase defines and generates a trained AI model that meets the operational need. This model is then used in the operational context during the inference phase.
  • the learning phase is therefore essential. Learning is considered effective if it produces a predictive model that fits the training data well but is also able to predict well on data not seen during training. If the model does not fit the training data, it suffers from underfitting. If the model fits the training data too well and is unable to generalize, it suffers from overfitting.
  • the learning phase requires collecting a large database that is as representative as possible of the operational context, and labeling it with respect to a ground truth (VT).
  • a ground truth is a reference image which represents the expected result of a segmentation operation.
  • the ground truth of an image represents at least one runway and one approach ramp, as well as the visible ground.
  • the result of a segmentation of an image is compared with the reference image or ground truth in order to evaluate the performance of the classification algorithm.
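  • the text does not name the comparison metric; intersection-over-union, sketched below on boolean masks, is one standard choice for scoring a segmentation against its ground truth:

```python
import numpy as np

def iou(predicted: np.ndarray, ground_truth: np.ndarray) -> float:
    """Intersection-over-union between a predicted segmentation mask and
    the reference (VT) mask, both boolean arrays of identical shape."""
    intersection = np.logical_and(predicted, ground_truth).sum()
    union = np.logical_or(predicted, ground_truth).sum()
    return float(intersection / union) if union else 1.0
```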
  • the learning phase makes it possible to define the architecture of the neural network and the associated hyper-parameters (the number of layers, the types of layers, the learning rate, etc.), then to seek, by successive iterations, the best parameters (the weights within and between the layers) which best model the different labels (runway/ramp).
  • the neural network propagates the data (extracting/abstracting characteristics specific to the objects of interest) and estimates the presence and position of the objects. From this estimate and the ground truth, the learning algorithm calculates a prediction error and backpropagates it through the network in order to update the model parameters.
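  • one iteration of this error/backpropagation cycle might look as follows; this is a sketch reusing the torchvision model from the previous listing, with illustrative optimizer settings:

```python
import torch

optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)

def train_step(images, targets):
    """images: list of CHW float tensors; targets: list of dicts with
    'boxes', 'labels' and 'masks' built from (sensor data, VT) pairs."""
    model.train()
    loss_dict = model(images, targets)  # per-head prediction errors
    loss = sum(loss_dict.values())      # total prediction error
    optimizer.zero_grad()
    loss.backward()                     # backpropagate the error
    optimizer.step()                    # update the model parameters
    return float(loss)
```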
  • a training database must contain a very large number of data items representing as many situations as possible, including, for the context of the invention, different approaches to different runways, with different approach light ramps, under different weather conditions.
  • the database thus constituted contains a plurality of labeled datasets, where each labeled dataset corresponds to a pair (sensor data, ground truth VT).
  • a VT ground truth for the operational context of the present invention is a description of various elements of interest to be recognized in the sensor data, including at least one runway and one approach ramp.
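  • such labeled pairs map naturally onto a dataset abstraction; a minimal sketch, in which the class name and in-memory storage are assumptions:

```python
from torch.utils.data import Dataset

class RunwayRampDataset(Dataset):
    """Each item is a labeled pair (sensor data, ground truth VT)."""

    def __init__(self, sensor_frames, vt_descriptions):
        assert len(sensor_frames) == len(vt_descriptions)
        self.sensor_frames = sensor_frames      # e.g. IR, LIDAR or radar data
        self.vt_descriptions = vt_descriptions  # runway / ramp annotations

    def __len__(self):
        return len(self.sensor_frames)

    def __getitem__(self, index):
        return self.sensor_frames[index], self.vt_descriptions[index]
```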
  • the method makes it possible to calculate an area in which the approach light ramp and/or the landing runway have been detected.
  • the target area is calculated as a rectangle surrounding the identified elements of interest. From the characteristics of the sensor and the attitude of the aircraft corresponding to the sensor data, the method calculates the heading and elevation coordinates of two opposite corners of the surrounding rectangle.
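  • the patent does not detail this conversion; below is a minimal sketch under strong simplifying assumptions (pinhole sensor model, boresight aligned with the aircraft axis, negligible roll; a real system must apply the full attitude and the sensor harmonization):

```python
import math

def pixel_to_heading_elevation(px, py, width, height, hfov_deg, vfov_deg,
                               heading_deg, pitch_deg):
    """Convert a pixel of the sensor image into (heading, elevation) angles."""
    # Focal lengths in pixels, derived from the sensor fields of view.
    fx = (width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    fy = (height / 2.0) / math.tan(math.radians(vfov_deg) / 2.0)
    # Pixel direction relative to the sensor boresight.
    azimuth = math.degrees(math.atan((px - width / 2.0) / fx))
    elevation = -math.degrees(math.atan((py - height / 2.0) / fy))
    # Shift by the aircraft attitude to obtain world-referenced coordinates.
    return heading_deg + azimuth, pitch_deg + elevation
```

Applied to the two opposite corners of the surrounding rectangle, this yields the target-area coordinates that are sent to the display device.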
  • the coordinates are sent to a head-up display device (or head-down, on a head-down EVS or CVS device), head-worn or not, and the area is displayed in a following step (208) as a symbol conformal with the outside world, in the form of a surrounding rectangle.
  • Figure 3 illustrates a head-up display of an EVS system with the display of a conformal symbol (302) obtained by the method of the invention.
  • the display of the symbol makes it possible to validate the trajectory towards the landing runway before its visual acquisition by the pilot.
  • the display of the symbol thus helps the pilot in acquiring the mandatory visual references before the DH since he knows he must search inside the rectangle.
  • the head-up device can also display an SVS, an EVS or a CVS.
  • the sensor image displayed is the one that fed the AI detection.
  • the method makes it possible to display a head-up IR image in which the pilot searches for his visual references aided by a framing symbol (402), for example a rectangle originating from the detection of the landing strip by the CNN model or by any other runway and ramp detection algorithm, on data from an active sensor, for example a millimeter radar.
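  • for a head-down or test display, the framing symbol could be overlaid on the sensor image along these lines (an OpenCV illustration, not the certified HUD pipeline; corner coordinates are assumed to be integer image pixels):

```python
import cv2

def draw_framing_symbol(image, corner_min, corner_max, color=(0, 255, 0)):
    """Overlay the surrounding rectangle (framing symbol) on a sensor image.
    corner_min / corner_max: (x, y) integer pixel tuples of opposite corners."""
    out = image.copy()
    cv2.rectangle(out, corner_min, corner_max, color, thickness=2)
    return out
```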
  • the aircraft benefits from the lowering of the EFVS landing minima.
  • the onboard sensor is a simple visible-light camera and its image is not presented to the pilot. Only the conformal symbol resulting from the method of the invention is presented to the pilot, providing an aid to the visual detection of the runway, for example during visual flights (VFR, Visual Flight Rules) in reduced visibility.
  • FIG. 5 illustrates a general architecture of a display system 500 making it possible to implement the method of the invention.
  • a validated AI model (architecture and the learned hyper-parameters) is integrated into a system on board an aircraft which comprises at least one sensor of the same type as that used for the learning.
  • the on-board system 500 also comprises a terrain database (BDT) 502, a database of elements of interest (BDEI) 504, a module 506 for generating a 3D synthetic forward view (SVS) from the position and attitude of the aircraft received from sensors 508, further sensors 510, an analysis module 512 comprising at least one validated AI model, and an SVS display device 514 for the aircraft crew.
  • the display device 514 may be a head-down display (HDD), a transparent head-up display (HUD), a transparent head-mounted display (HWD), or the windshield of the aircraft.
  • the usual piloting symbology presenting the piloting parameters of the aircraft is superimposed on the synthetic 3D view.
  • the analysis module 512 can be configured to correct the airstrip position shown on the SVS 506.
  • the invention can be implemented from hardware and / or software elements.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Traffic Control Systems (AREA)
EP20797523.6A 2019-11-07 2020-11-03 Method and device for assisting the landing of an aircraft in poor visibility conditions Pending EP4055343A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1912486A FR3103050B1 (fr) 2019-11-07 2019-11-07 Method and device for assisting aircraft landing in degraded visibility conditions
PCT/EP2020/080807 WO2021089539A1 (fr) 2019-11-07 2020-11-03 Method and device for assisting aircraft landing in degraded visibility conditions

Publications (1)

Publication Number Publication Date
EP4055343A1 2022-09-14

Family

ID=70154467

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20797523.6A Pending EP4055343A1 (de) 2019-11-07 2020-11-03 Method and device for assisting the landing of an aircraft in poor visibility conditions

Country Status (4)

Country Link
US (1) US20220373357A1 (de)
EP (1) EP4055343A1 (de)
FR (1) FR3103050B1 (de)
WO (1) WO2021089539A1 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11866194B2 (en) * 2021-10-30 2024-01-09 Beta Air, Llc Systems and methods for a visual system for an electric aircraft
US20240101273A1 (en) * 2022-09-26 2024-03-28 Rockwell Collins, Inc. Pilot alerting of detected runway environment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8264498B1 (en) * 2008-04-01 2012-09-11 Rockwell Collins, Inc. System, apparatus, and method for presenting a monochrome image of terrain on a head-up display unit
FR2947083B1 * 2009-06-23 2011-11-11 Thales Sa Device and method for landing assistance
US20130041529A1 (en) * 2011-08-11 2013-02-14 Honeywell International Inc. Aircraft vision system having redundancy for low altitude approaches
US9280904B2 (en) * 2013-03-15 2016-03-08 Airbus Operations (S.A.S.) Methods, systems and computer readable media for arming aircraft runway approach guidance modes
CA2995850A1 (en) * 2015-08-31 2017-03-09 Ryan Kottenstette Systems and methods for analyzing remote sensing imagery
FR3049744B1 2016-04-01 2018-03-30 Thales Method for the synthetic representation of elements of interest in a display system for an aircraft
FR3058233B1 * 2016-11-03 2018-11-16 Thales Method for superimposing an image from a sensor on a synthetic image by automatic detection of the visibility limit, and associated display system

Also Published As

Publication number Publication date
FR3103050B1 (fr) 2021-11-26
WO2021089539A1 (fr) 2021-05-14
US20220373357A1 (en) 2022-11-24
FR3103050A1 (fr) 2021-05-14

Similar Documents

Publication Publication Date Title
EP0678841B1 Landing aid device
EP3657213B1 Method for training a neural network on board an aircraft for assisting the landing of said aircraft, and server for implementing such a method
US20210158157A1 (en) Artificial neural network learning method and device for aircraft landing assistance
US20130004017A1 (en) Context-Based Target Recognition
EP3226062B1 Method for the synthetic representation of elements of interest in a display system for an aircraft
CN105158762A Identifying and tracking convective weather cells
FR3103048A1 Method and device for generating synthetic training data for an artificial intelligence machine for aircraft landing assistance
EP1870789A1 System for detecting obstacles in the vicinity of a landing point
FR2908218A1 Device for assisting the navigation of an aircraft in an airport zone
FR3003989A1 Method for optically locating and guiding a vehicle relative to an airport
Nagarani et al. Unmanned Aerial vehicle’s runway landing system with efficient target detection by using morphological fusion for military surveillance system
FR2736149A1 Device for recognizing and tracking objects
US20220128701A1 (en) Systems and methods for camera-lidar fused object detection with lidar-to-image detection matching
EP4055343A1 (de) Verfahren und vorrichtung zur unterstützung der landung eines flugzeuges bei schlechten sichtverhältnissen
FR3054357A1 (fr) Procede et dispositif de determination de la position d'un aeronef lors d'une approche en vue d'un atterrissage
FR3077393A1 (fr) Véhicules aériens à vision artificielle
EP3656681A1 (de) Vorrichtung und verfahren zur landungsunterstützung eines luftfahrzeugs bei eingeschränkten sichtbedingungen
WO2021089536A1 (fr) Procede et dispositif de generation de donnees d'apprentissage pour machine d'intelligence artificielle pour l'aide a l'atterrissage d'aeronef
EP2150776A1 (de) Anzeigevorrichtung für ein flugzeug mit mitteln zur anzeige einer navigationssymbologie zur vermeidung von hindernissen
FR2808588A1 (fr) Procede et dispositif de determination de la position d'un vehicule par rapport a une trajectoire ideale
Bharti et al. Neural Network Based Landing Assist Using Remote Sensing Data
US20220309786A1 (en) Method for training a supervised artificial intelligence intended to identify a predetermined object in the environment of an aircraft
EP3578926B1 Method for securing the processing of an aircraft synthetic vision system, associated system and computer program product
US20220128700A1 (en) Systems and methods for camera-lidar fused object detection with point pruning
Liu et al. Runway detection during approach and landing based on image fusion

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220505

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230427

17Q First examination report despatched

Effective date: 20230602