EP4055343A1 - Method and device for assisting in landing an aircraft under poor visibility conditions - Google Patents
Method and device for assisting in landing an aircraft under poor visibility conditions
- Publication number
- EP4055343A1 (application EP20797523.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data
- aircraft
- landing
- sensor
- pilot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C23/00—Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
- G01C23/005—Flight directors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D43/00—Arrangements or adaptations of instruments
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
- B64D45/04—Landing aids; Safety measures to prevent collision with earth's surface
- B64D45/08—Landing aids; Safety measures to prevent collision with earth's surface optical
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/933—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
- G01S13/934—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft on airport surfaces, e.g. while taxiing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/933—Lidar systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/176—Urban or other man-made structures
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0017—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
- G08G5/0021—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/02—Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
- G08G5/025—Navigation or guidance aids
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10044—Radar image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
Definitions
- the invention relates to the field of landing assistance systems for aircraft based on on-board imaging cameras or sensors.
- the invention more specifically addresses the problem of assisting the landing of aircraft in difficult weather conditions, in particular conditions of reduced or degraded visibility, in the event of fog for example.
- a decision height (DH) is 60.96 meters (200 ft), and the corresponding decision altitude (DA) is the runway altitude + 60.96 meters (+200 ft).
- ILS Instrument Landing System
- the ILS system is based on several items of radio-frequency equipment installed on the ground, at the level of the landing runway, and a compatible instrument placed on board the aircraft.
- the use of such a guidance system requires expensive equipment and specific pilot qualification.
- it cannot be installed at all airports: its cost makes it acceptable only at the main ones.
- new technologies based on satellite positioning systems will probably replace ILS systems in the future.
- a so-called synthetic vision solution, SVS (“Synthetic Vision System”), makes it possible to display the terrain and the landing runways from the position of the aircraft supplied by a GPS and from its attitude supplied by its inertial unit.
- SVS synthetic Vision System
- the uncertainty in the position of the aircraft, as well as the limited precision of the runway positions stored in the databases, prohibits the use of an SVS in critical phases where the aircraft is close to the ground, such as landing and take-off.
- SVGS (“Synthetic Vision with Guidance System”) solutions
- adding certain controls to an SVS allows a limited reduction in landing minima (the DH decision height is reduced by only 15.24 meters (50 ft), on ILS SA CAT I approaches).
- EVS (Enhanced Vision System), known as augmented vision
- EFVS (Enhanced Flight Vision System)
- This solution uses electro-optical, infrared or radar sensors to film the airport environment during the landing of an aircraft.
- the principle is to use sensors that are more efficient than the pilot's eye in degraded weather conditions, and to embed the information collected by the sensors in the pilot's field of vision, through a head-up display or on the visor of a helmet worn by the pilot.
- This technique is essentially based on the use of sensors to detect the radiation from the lamps arranged along the runway and on the approach ramp.
- Incandescent lamps produce visible light, but they also emit in the infrared range. Sensors operating in the infrared make it possible to detect this radiation, with a detection range better than that of the human eye in the visible range under degraded weather conditions. This improvement in visibility therefore makes it possible, to a certain extent, to improve the approach phases and limit abandoned approaches.
- this technique is based on parasitic infrared radiation from the lamps present in the vicinity of the runway. For the sake of lamp durability, the current trend is to replace incandescent lamps with LED lamps. The latter have a less extensive spectrum in the infrared range. A side effect is therefore to cause technical obsolescence of EVS systems based on infrared sensors.
- An alternative to infrared sensors is obtaining images with a radar sensor, in the centimeter or millimeter band. Certain frequency bands, chosen outside the water-vapor absorption peaks, have very low sensitivity to severe weather conditions. Such sensors therefore make it possible to produce an image through fog, for example. However, even though these sensors have fine range resolution, they exhibit much coarser angular resolution than optical solutions. The resolution is directly related to the size of the antennas used, and it is often too coarse to obtain a precise position of the runway at a sufficient distance to perform the realignment maneuvers.
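The antenna-size limitation can be made concrete with an order-of-magnitude calculation (illustrative values, not taken from the patent): the beamwidth of a real-aperture antenna is roughly the wavelength divided by the aperture, so the cross-range resolution degrades linearly with distance.

```python
def cross_range_resolution(freq_hz: float, antenna_m: float, range_m: float) -> float:
    """Approximate cross-range resolution of a real-aperture radar.

    Beamwidth (radians) ~ wavelength / antenna aperture, so the ground
    patch covered by the beam grows linearly with range.
    """
    wavelength = 3.0e8 / freq_hz          # c / f
    beamwidth = wavelength / antenna_m    # small-angle approximation
    return range_m * beamwidth

# Assumed figures: 94 GHz millimetre-wave radar, 30 cm antenna,
# runway observed from 2 km away
res = cross_range_resolution(94e9, 0.30, 2000.0)
```

With these assumed figures the radar resolves roughly 21 m across-track at 2 km, which illustrates why such a sensor struggles to position a runway edge precisely at realignment distances.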
- Figure 1 illustrates a landing phase piloting symbology for a head-up display of an EVS system. The conformity of symbols depends primarily on the accuracy of the aircraft attitude data.
- An object of the invention is to overcome the drawbacks of the known techniques by meeting the aforementioned needs with a solution for assisting the landing of aircraft, in particular an aid to visual identification for the pilot before reaching the decision height (DH) or decision altitude (DA).
- DH decision height
- DA decision altitude
- a computer-implemented method for assisting aircraft landing in conditions of degraded visibility comprising at least the steps of:
- the data reception step consists of receiving data from a sensor on board the aircraft and looking forward, said sensor being chosen from the group comprising a piloting FLIR, a multispectral camera, a LIDAR and a millimeter-wave radar.
- the step of determining data of interest consists in executing an artificial intelligence algorithm on the received sensor data, the algorithm implementing an artificial intelligence model trained for image processing, obtained during a learning phase through deep learning.
- deep learning is based on convolutional neural networks.
- the step of calculating a target area consists in determining, from the characteristics of the sensor and the attitude of the aircraft corresponding to the sensor data, the heading and elevation coordinates of the target area.
- the method comprises after the step of calculating a target area, a step of sending the coordinates of the target area to the head-up display device.
- the coordinates of the target area correspond to two opposite corners of a rectangle surrounding the data of interest characteristic of said landing runway and/or said approach ramp, and the compliant symbol which is displayed is said surrounding rectangle.
- the head-up display step consists in displaying the framing symbol on a fixed head-up screen in the cockpit and / or on a head-up screen worn by the pilot.
- the invention also covers a computer program product comprising code instructions making it possible to perform the steps of the method for assisting in aircraft landing, in particular in conditions of degraded visibility, as claimed, when the program is run on a computer.
- the invention further covers a device for assisting the landing of an aircraft, in particular in conditions of degraded visibility, the device comprising means for implementing the steps of the method for assisting the landing of an aircraft in degraded visibility conditions according to any one of the claims.
- the data allowing the calculation of the target area come from a first sensor, the device further comprising a second sensor capable of providing an image that can be displayed in the head-up device worn by the pilot, the compliant symbol calculated from the data of the first sensor being displayed on said image supplied by the second sensor.
- Another object of the invention is a man-machine interface comprising means for displaying a compliant symbol obtained according to the claimed method.
- Another object of the invention is a landing aid system, in particular of the SVS, SVGS, EVS, EFVS or CVS type, embedding an aircraft landing aid device, in particular for conditions of degraded visibility, as claimed.
- the invention also addresses an aircraft comprising an aircraft landing aid device, in particular in conditions of degraded visibility, as claimed.
- FIG.2 a method of assisting the landing of an aircraft making it possible to obtain a compliant symbol for a head-up display, according to one embodiment of the invention
- FIG.3 a head-up display of an EVS system with the display of a compliant symbol obtained by the method of the invention
- FIG.4 a display of a symbol according to the invention on an IR image
- FIG.5 a general architecture of a display system for implementing the method of the invention.
- FIG. 2 illustrates the steps of a method 200 for assisting the landing of an aircraft, making it possible to obtain a compliant symbol for a head-up display, according to one embodiment of the invention.
- the method begins upon receipt 202 of sensor data, from a sensor on board an aircraft and looking forward.
- the method of the invention applies to any type of sensor, whether it is a pilot FLIR providing an IR image, a multispectral camera, a LIDAR, or a millimeter radar.
- the technical problem that the invention solves is that of aiding the pilot's detection of the landing runway, in a sensor image or in direct vision, before descending below the decision height, especially in degraded visibility conditions.
- the invention allows the display of a new compliant symbol generated from an automatic detection technique of the approach ramp or the landing runway in data from a sensor (sensor data).
- the symbol displayed is perfectly consistent with the outside world and indicates to the pilot the zone where the runway and / or the approach ramp will appear before they are visible to the pilot with the naked eye in direct vision.
- the method makes it possible to determine, in the received sensor data, data of interest which are characteristic of the landing runway and/or of the approach ramp.
- the sensor can be an IR or multispectral sensor whose image is presented to the pilot, or a second sensor of the active type, in principle more efficient than an IR sensor, such as for example a millimeter-wave radar, whose data are not displayable to the pilot because they are difficult to interpret.
- the determination of data of interest consists in implementing a conventional algorithm for detecting lines and patterns.
- the Applicant's patent application FR3049744 describes an example of such a conventional detection algorithm.
- the algorithm consists in calculating a bounding box of the detected elements of interest, in the form of a rectangle whose two opposite corners correspond, in pixel coordinates, respectively to the smallest X and Y coordinates and to the largest X and Y coordinates among the pixels belonging to the detected elements.
- the area of the rectangle can be increased by a few percent, for example 10%, while remaining centered on the initial rectangle.
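The bounding-box construction described above can be sketched as follows (a minimal illustration; the function name and the 10% default margin are chosen for this example only):

```python
def bounding_box(pixels, margin=0.10):
    """Axis-aligned box around detected pixels, optionally enlarged.

    `pixels` is an iterable of (x, y) coordinates of pixels classified as
    runway/ramp. The returned corners are (x_min, y_min) and (x_max, y_max),
    grown by `margin` (e.g. 10 %) while remaining centred on the initial box.
    """
    xs = [p[0] for p in pixels]
    ys = [p[1] for p in pixels]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    dx = (x_max - x_min) * margin / 2.0   # half the enlargement per axis,
    dy = (y_max - y_min) * margin / 2.0   # split between the two sides
    return (x_min - dx, y_min - dy), (x_max + dx, y_max + dy)

# Three hypothetical detected pixels
corners = bounding_box([(10, 40), (30, 44), (18, 60)])
```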
- the step of determining data of interest consists in executing an artificial intelligence algorithm on the received sensor data, the algorithm implementing an artificial intelligence image-processing model trained for landing runway and approach ramp detection.
- the trained model is a model on board the aircraft for operational use, which was obtained during a learning phase, and in particular by deep learning by artificial neural network for runway and ramp detection.
- the artificial neural network is a convolutional neural network (CNN for “Convolutional Neural Network”).
- a classical CNN-based model can be set up for the detection and segmentation of runway and ramp, for example by using a Mask R-CNN (Regions with CNN features) architecture with a ResNet-101 (101-layer) backbone [Mask R-CNN, He et al., 2017]. From this model, transfer learning (followed by finer learning) can be carried out to adapt to the runway and ramp use case.
- Mask R-CNN (Regions with CNN features)
- ResNet-101 (101 layers)
- the goal of deep learning is to model with a high level of data abstraction.
- the learning phase defines and generates a trained AI model that meets the operational need. This model is then used in the operational context during the inference phase.
- the learning phase is therefore essential. Learning is considered efficient if it defines a predictive model that fits the training data well but is also able to predict well on data not seen during training. If the model does not fit the training data, it suffers from underfitting. If the model fits the training data too well and is not able to generalize, it suffers from overfitting.
- the learning phase requires having collected a large database, as representative as possible of the operational context, and having labeled its data with respect to the ground truth (VT).
- VT ground truth
- the ground truth is a reference image which represents the expected result of a segmentation operation.
- the ground truth of an image represents at least one runway and an approach ramp as well as the visible ground.
- the result of a segmentation of an image is compared with the reference image or ground truth in order to evaluate the performance of the classification algorithm.
- the learning phase makes it possible to define the architecture of the neural network and the associated hyper-parameters (the number of layers, the types of layers, the learning rate, etc.), then to seek, by successive iterations, the best parameters (the weights within and between layers) which best model the different labels (runway/ramp).
- the neural network propagates the data forward (extracting and abstracting characteristics specific to the objects of interest) and estimates the presence and position of the objects. From this estimate and the ground truth, the learning algorithm calculates a prediction error and backpropagates it through the network in order to update the model parameters.
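As a toy illustration of this predict/compare/update loop (a single linear weight rather than a real CNN; all values are hypothetical):

```python
# One weight w is adjusted so that w * x predicts y, using the gradient
# of the squared prediction error. Real networks apply the same principle
# per layer via backpropagation.
samples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # hypothetical (input, ground truth) pairs
w, lr = 0.0, 0.05                                # initial weight, learning rate

for _ in range(200):                  # training iterations
    for x, y_true in samples:
        y_pred = w * x                # forward pass (prediction)
        error = y_pred - y_true       # prediction error vs. ground truth
        w -= lr * 2.0 * error * x     # gradient step (parameter update)
```

After training, `w` converges to 2.0, the slope that generated the ground-truth pairs.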
- a training database must contain a very large number of data representing as many situations as possible, including, for the context of the invention, different approaches on different runways with different approach light ramps under different weather conditions.
- the database which is constituted contains a plurality of labeled datasets, where each labeled dataset corresponds to a pair (sensor data, ground truth VT).
- a VT ground truth for the operational context of the present invention is a description of various elements of interest to be recognized in the sensor data, including at least one runway and one approach ramp.
- the method makes it possible to calculate an area in which the approach light ramp and/or the landing runway have been detected.
- the target area is calculated as a rectangle surrounding the identified elements of interest. From the characteristics of the sensor and the attitude of the aircraft corresponding to the sensor data, the method makes it possible to calculate the coordinates, in heading and in elevation, of two opposite corners of a surrounding rectangle.
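A simplified sketch of this pixel-to-angle conversion, assuming a linear (pinhole, roll-free) projection whose boresight is aligned with the aircraft axis; the function and all numeric values are illustrative, not taken from the patent:

```python
def pixel_to_heading_elevation(px, py, width, height, hfov_deg, vfov_deg,
                               aircraft_heading_deg, aircraft_pitch_deg):
    """Map a pixel of a forward-looking sensor to heading/elevation angles.

    The image centre is assumed aligned with the aircraft boresight and
    angles vary linearly across the field of view. Each corner of the
    surrounding rectangle can be converted the same way.
    """
    az_off = (px / width - 0.5) * hfov_deg     # azimuth offset from centre
    el_off = (0.5 - py / height) * vfov_deg    # image y grows downward
    heading = (aircraft_heading_deg + az_off) % 360.0
    elevation = aircraft_pitch_deg + el_off
    return heading, elevation

# Hypothetical values: 640x480 sensor, 40x30 degree field of view,
# aircraft heading 090 and pitch -3 degrees on final approach
h, e = pixel_to_heading_elevation(480, 360, 640, 480, 40.0, 30.0, 90.0, -3.0)
```

A real implementation would also account for roll, sensor mounting offsets and lens distortion, which this sketch deliberately ignores.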
- the coordinates are sent to a head-up display device (or head-down, on a head-down EVS or CVS device), head-worn or not, and the area is displayed in a following step (208) as a symbol conforming to the outside world, in the form of a surrounding rectangle.
- Figure 3 illustrates a head-up display of an EVS system with the display of a conforming symbol (302) obtained by the method of the invention.
- the display of the symbol makes it possible to validate the trajectory towards the landing strip before the visual acquisition thereof by the pilot.
- the display of the symbol thus helps the pilot in acquiring the mandatory visual references before the DH since he knows he must search inside the rectangle.
- the head-up device can also display an SVS, an EVS or a CVS.
- the sensor image displayed is the one on which the AI search was performed.
- the method makes it possible to display a head-up IR image in which the pilot searches for his visual references aided by a framing symbol (402), for example a rectangle originating from the detection of the landing strip by the CNN model or by any other runway and ramp detection algorithm, on data from an active sensor, for example a millimeter radar.
- a framing symbol for example a rectangle originating from the detection of the landing strip by the CNN model or by any other runway and ramp detection algorithm
- an active sensor for example a millimeter radar.
- the aircraft benefits from the lowering of the EFVS landing minima.
- the onboard sensor is a simple visible-light camera and its image is not presented to the pilot. Only the compliant symbol resulting from the method of the invention is presented, providing an aid in the visual detection of the runway, for example during visual flights (VFR, Visual Flight Rules) in reduced visibility.
- VFR Visual Flight Rules
- FIG. 5 illustrates a general architecture of a display system 500 making it possible to implement the method of the invention.
- a validated AI model (architecture and the learned hyper-parameters) is integrated into a system on board an aircraft which comprises at least one sensor of the same type as that used for the learning.
- the on-board system 500 also comprises a terrain database (BDT) 502, a database of elements of interest (BDEI) 504, a module 506 for generating a 3D synthetic view towards the front of the aircraft (SVS) from the position and attitude of the aircraft received from sensors 508, further sensors 510, an analysis module 512 comprising at least one validated AI model, and an SVS display device 514 for the aircraft crew.
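The data flow between these modules can be caricatured in a few lines (names, reference-numeral comments and signatures are illustrative only, not from the patent):

```python
# Sensor data flows through the AI analysis module, which returns the
# target-area corners that the display device draws as the conforming
# rectangle over the pilot's view.

def analysis_module(sensor_frame, detect):
    """Stand-in for module 512: run the AI model `detect` on one frame."""
    pixels = detect(sensor_frame)          # pixels classified runway/ramp
    xs = [p[0] for p in pixels]
    ys = [p[1] for p in pixels]
    return (min(xs), min(ys)), (max(xs), max(ys))

def display_device(corners):
    """Stand-in for device 514: describe the symbol the HUD would overlay."""
    (x0, y0), (x1, y1) = corners
    return f"rectangle {x0},{y0} -> {x1},{y1}"

fake_detect = lambda frame: [(5, 5), (9, 7)]   # stand-in for the CNN model
symbol = display_device(analysis_module(None, fake_detect))
```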
- the display device 514 may be a head-down display (HDD), a transparent head-up display (HUD), a transparent head-worn display (HWD), or the aircraft's windshield.
- HDD head-down display
- HUD transparent head-up display
- HWD transparent head-mounted display
- the usual piloting symbology presenting the piloting parameters of the aircraft is superimposed on the synthetic 3D view.
- the analysis module 512 can be configured to correct the airstrip position shown on the SVS 506.
- the invention can be implemented from hardware and/or software elements.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1912486A FR3103050B1 (en) | 2019-11-07 | 2019-11-07 | AIRCRAFT LANDING ASSISTANCE PROCESS AND DEVICE IN CONDITIONS OF DEGRADED VISIBILITY |
PCT/EP2020/080807 WO2021089539A1 (en) | 2019-11-07 | 2020-11-03 | Method and device for assisting in landing an aircraft under poor visibility conditions |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4055343A1 true EP4055343A1 (en) | 2022-09-14 |
Family
ID=70154467
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20797523.6A Pending EP4055343A1 (en) | 2019-11-07 | 2020-11-03 | Method and device for assisting in landing an aircraft under poor visibility conditions |
Country Status (4)
Country | Link |
---|---|
US (1) | US20220373357A1 (en) |
EP (1) | EP4055343A1 (en) |
FR (1) | FR3103050B1 (en) |
WO (1) | WO2021089539A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11866194B2 (en) * | 2021-10-30 | 2024-01-09 | Beta Air, Llc | Systems and methods for a visual system for an electric aircraft |
US20240101273A1 (en) * | 2022-09-26 | 2024-03-28 | Rockwell Collins, Inc. | Pilot alerting of detected runway environment |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8264498B1 (en) * | 2008-04-01 | 2012-09-11 | Rockwell Collins, Inc. | System, apparatus, and method for presenting a monochrome image of terrain on a head-up display unit |
FR2947083B1 (en) * | 2009-06-23 | 2011-11-11 | Thales Sa | DEVICE AND METHOD FOR LANDING ASSISTANCE |
US20130041529A1 (en) * | 2011-08-11 | 2013-02-14 | Honeywell International Inc. | Aircraft vision system having redundancy for low altitude approaches |
US9280904B2 (en) * | 2013-03-15 | 2016-03-08 | Airbus Operations (S.A.S.) | Methods, systems and computer readable media for arming aircraft runway approach guidance modes |
AU2016315938B2 (en) * | 2015-08-31 | 2022-02-24 | Cape Analytics, Inc. | Systems and methods for analyzing remote sensing imagery |
FR3049744B1 (en) | 2016-04-01 | 2018-03-30 | Thales | METHOD FOR SYNTHETICALLY REPRESENTING ELEMENTS OF INTEREST IN A VISUALIZATION SYSTEM FOR AN AIRCRAFT |
FR3058233B1 (en) * | 2016-11-03 | 2018-11-16 | Thales | METHOD FOR OVERLAYING AN IMAGE FROM A SENSOR ON A SYNTHETIC IMAGE BY AUTOMATICALLY DETECTING THE VISIBILITY LIMIT AND VISUALISATION SYSTEM THEREOF |
- 2019
  - 2019-11-07 FR FR1912486A patent/FR3103050B1/en active Active
- 2020
  - 2020-11-03 WO PCT/EP2020/080807 patent/WO2021089539A1/en unknown
  - 2020-11-03 US US17/775,225 patent/US20220373357A1/en active Pending
  - 2020-11-03 EP EP20797523.6A patent/EP4055343A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20220373357A1 (en) | 2022-11-24 |
FR3103050B1 (en) | 2021-11-26 |
WO2021089539A1 (en) | 2021-05-14 |
FR3103050A1 (en) | 2021-05-14 |
Similar Documents
Publication | Title |
---|---|
EP0678841B1 (en) | Landing aid device | |
US20210158157A1 (en) | Artificial neural network learning method and device for aircraft landing assistance | |
US9086484B2 (en) | Context-based target recognition | |
EP3657213B1 (en) | Learning method of a neural network on-board an aircraft for landing assistance of said aircraft and server for implementing such a method | |
EP3226062B1 (en) | Method for synthetic representation of elements of interest in a display system for an aircraft | |
CN105158762A (en) | Identifying and tracking convective weather cells | |
EP1870789A1 (en) | System for detecting obstacles in the proximity of a landing point | |
FR3103048A1 (en) | PROCESS AND DEVICE FOR GENERATING SYNTHETIC LEARNING DATA FOR ARTIFICIAL INTELLIGENCE MACHINE FOR AIRCRAFT LANDING AID | |
US20220128700A1 (en) | Systems and methods for camera-lidar fused object detection with point pruning | |
FR2908218A1 (en) | DEVICE FOR AIDING NAVIGATION OF AN AIRCRAFT IN AN AIRPORT AREA | |
FR3003989A1 (en) | METHOD FOR LOCATING AND GUIDING A VEHICLE OPTICALLY IN RELATION TO AN AIRPORT | |
Nagarani et al. | Unmanned Aerial vehicle’s runway landing system with efficient target detection by using morphological fusion for military surveillance system | |
EP4055343A1 (en) | Method and device for assisting in landing an aircraft under poor visibility conditions | |
FR3054357A1 (en) | METHOD AND DEVICE FOR DETERMINING THE POSITION OF AN AIRCRAFT DURING AN APPROACH FOR LANDING | |
FR3077393A1 (en) | Aerial vehicles with artificial vision | |
EP3656681A1 (en) | Device and method for assisting in the landing of an aircraft in conditions of reduced visibility | |
EP4055349A1 (en) | Method and device for generating learning data for an artificial intelligence machine for aircraft landing assistance | |
EP2150776A1 (en) | Display device for an aircraft including means for displaying a navigation symbology dedicated to obstacle avoidance | |
FR2808588A1 (en) | Method of determination of position of a vehicle relative to an ideal trajectory, uses image data collection devices mounted on vehicle to read markers installed alongside desired trajectory | |
FR3033903A1 (en) | NAVIGATION ASSISTANCE SYSTEM FOR AN AIRCRAFT WITH HIGH HEAD DISPLAY SCREEN AND CAMERA. | |
Bharti et al. | Neural Network Based Landing Assist Using Remote Sensing Data | |
US20220309786A1 (en) | Method for training a supervised artificial intelligence intended to identify a predetermined object in the environment of an aircraft | |
EP3578926B1 (en) | Securing method of processing of an aircraft synthetic vision system, associated system and computer program product | |
Liu et al. | Runway detection during approach and landing based on image fusion | |
Korn et al. | Pilot assistance systems: Enhanced and synthetic vision for automatic situation assessment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20220505 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| P01 | Opt-out of the competence of the unified patent court (upc) registered | Effective date: 20230427 |
| 17Q | First examination report despatched | Effective date: 20230602 |