US20190233101A1 - Aerial vehicles with machine vision - Google Patents
- Publication number
- US20190233101A1 (application US16/240,913)
- Authority
- US
- United States
- Prior art keywords
- aerial vehicle
- sensor data
- geographic
- determining
- indicator
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0017—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
- G08G5/0021—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
- B64D45/04—Landing aids; Safety measures to prevent collision with earth's surface
-
- G06K9/0063—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0047—Navigation or guidance aids for a single aircraft
- G08G5/0069—Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0073—Surveillance aids
- G08G5/0086—Surveillance aids for monitoring terrain
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/02—Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/02—Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
- G08G5/025—Navigation or guidance aids
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/06—Traffic control systems for aircraft, e.g. air-traffic control [ATC] for control when on the ground
- G08G5/065—Navigation or guidance aids, e.g. for taxiing or rolling
-
- B64C2201/021—
-
- B64C2201/141—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/25—Fixed-wing aircraft
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
Definitions
- the present subject matter relates generally to aerial vehicles and, in particular, to aerial vehicles and other aircraft with machine vision.
- a geographic location of an aircraft can be determined relative to a desired landing position or area, such as a runway, landing pad, or other suitable landing area.
- the position of the aircraft relative to the desired landing area aids in successfully landing the aircraft.
- understanding an approach vector during a landing attempt may further aid in safely landing the aircraft.
- many current systems rely on global positioning systems (GPS).
- Many of these systems are dependent on resources external to the aircraft and on communication with the aircraft, making them subject to loss of utility if communication with a pilot is lost. Additionally, many GPS-based systems will not provide any helpful information with regard to obstructions, unsafe landing conditions, or misidentification of a runway if the general position (based on GPS) is correct.
- an aerial vehicle can include a plurality of sensors mounted thereon, an avionics system configured to operate at least a portion of the aerial vehicle, and a machine vision controller in operative communication with the avionics system and the plurality of sensors.
- the machine vision controller can be configured to perform a method.
- the method can include obtaining sensor data from at least one sensor of the plurality of sensors.
- the sensor data is associated with a proximal location of the aerial vehicle and the proximal location is within sensor range of the aerial vehicle.
- the method can also include determining performance data associated with the aerial vehicle, processing the sensor data based on the performance data to compensate for movement of the aerial vehicle, identifying at least one geographic indicator based on processing the sensor data, and determining a geographic location of the aerial vehicle based on the at least one geographic indicator.
- an unmanned aerial vehicle can include a plurality of sensors mounted thereon, an avionics system configured to operate at least a portion of the unmanned aerial vehicle, and a machine vision controller in operative communication with the avionics system and the plurality of sensors.
- the machine vision controller can be configured to perform a method.
- the method can include obtaining sensor data from at least one sensor of the plurality of sensors.
- the sensor data is associated with a proximal location of the unmanned aerial vehicle and the proximal location is within sensor range of the unmanned aerial vehicle.
- the method can also include identifying at least one geographic indicator based on processing the sensor data, determining whether the at least one geographic indicator is associated with a flight plan of the unmanned aerial vehicle, and performing an automatic maneuver of the unmanned aerial vehicle based on the determining.
- a method of locating an aerial vehicle relative to a proximal location can include obtaining sensor data from one or more optical sensors mounted on the aerial vehicle.
- the sensor data is associated with the proximal location and the proximal location is within sensor range of the aerial vehicle.
- the method can also include determining performance data from one or more additional sensors mounted on the aerial vehicle, processing the sensor data based on the performance data to compensate for movement of the aerial vehicle, identifying at least one geographic indicator based on processing the sensor data, and determining a geographic location of the aerial vehicle based on the at least one geographic indicator.
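- the summarized method can be sketched as a minimal Python pipeline. All names here (`PerformanceData`, `compensate_for_motion`, `GeoIndicator`, `locate`) are hypothetical illustrations of the claimed steps, not the patent's implementation; real machine-vision matching and motion compensation would operate on image frames rather than labeled features.

```python
from dataclasses import dataclass

@dataclass
class PerformanceData:
    thrust_heading_deg: float   # heading the aircraft nose points toward
    crosswind_deg: float        # heading correction induced by crosswind

@dataclass
class GeoIndicator:
    name: str
    position: tuple  # known (x, y) of the indicator on the site map

def compensate_for_motion(sensor_frame, perf: PerformanceData):
    """Hypothetical compensation step: tag the frame with the actual
    approach heading so downstream matching uses the corrected viewpoint."""
    actual_heading = perf.thrust_heading_deg + perf.crosswind_deg
    return {"frame": sensor_frame, "heading_deg": actual_heading}

def identify_indicators(processed, known_indicators):
    """Stand-in for machine-vision matching: return the known indicators
    whose names appear in the (already-labeled) frame."""
    return [gi for gi in known_indicators if gi.name in processed["frame"]]

def locate(sensor_frame, perf, known_indicators):
    """Obtain -> compensate -> identify -> locate, per the claimed method."""
    processed = compensate_for_motion(sensor_frame, perf)
    found = identify_indicators(processed, known_indicators)
    if not found:
        return None  # no geographic indicator identified
    # Naive position fix: average the known positions of matched indicators.
    xs = [gi.position[0] for gi in found]
    ys = [gi.position[1] for gi in found]
    return (sum(xs) / len(found), sum(ys) / len(found))
```

In this sketch the position fix is simply the centroid of the matched indicators; a real controller would triangulate from bearings and ranges.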
- FIG. 1 is a schematic illustration of an example runway or landing area for an aerial vehicle.
- FIG. 2A is a schematic illustration of an aerial vehicle approaching a portion of the landing area of FIG. 1 .
- FIG. 2B is a schematic illustration of example sensor data obtained by the aerial vehicle of FIG. 2A .
- FIG. 3A is a schematic illustration of an additional aerial vehicle approaching a portion of the landing area of FIG. 1 .
- FIG. 3B is a schematic illustration of example sensor data obtained by the aerial vehicle of FIG. 3A .
- FIG. 4 is a diagram of an example aerial vehicle, according to example embodiments of the present disclosure.
- FIG. 5 is a flow diagram of a method of machine vision processing of sensor data of an aerial vehicle, according to example embodiments of the present disclosure.
- FIG. 6 is a flow diagram of an additional method of machine vision processing of sensor data of an aerial vehicle, according to example embodiments of the present disclosure.
- FIG. 7 is a block diagram of an example computing system that can be used to implement methods and systems according to example embodiments of the present disclosure.
- a machine vision controller for an aerial vehicle or aircraft.
- the machine vision controller may execute computer-readable program instructions for: processing sensor data, identifying geographic identifiers in sensor data, comparing obtained sensor data to desired sensor data, identifying geographic location from sensor data, automatically maneuvering the aerial vehicle responsive to sensor data and other data, determining if an approach vector is safe for landing, reducing risk associated with instrumentation-only landings, and/or other program instructions.
- the machine vision controller may be integrated within an aerial vehicle or may be a remote controller configured to transmit calculated data to the aerial vehicle. The machine vision controller may also be configured to provide warnings, both audible and visual, to operators of the aerial vehicle, to aid in a plurality of aircraft maneuvers.
- Embodiments of the disclosed technology provide a number of technical benefits and advantages, particularly in the area of aircraft safety.
- the disclosed technology provides for safer landings by ensuring an aerial vehicle is approaching an appropriate runway or airport.
- the disclosed technology can also aid in automatic avoidance maneuvers through identifying geographic indicators. By identifying geographic indicators and matching geographic indicators to a map or model of an expected proximal area, the disclosed systems can avoid initiating a landing operation if an area is unexpected or if obstacles exist.
- Embodiments of the disclosed technology additionally provide a number of technical benefits and advantages in the area of computing technology.
- the disclosed system can obtain sensor data associated with performance data of an aerial vehicle and automatically determine whether performance of an aircraft is affecting a landing approach.
- a computing system implemented in accordance with the disclosed technology can therefore avoid costly landing maneuvers in adverse conditions or implement safer landing procedures through assuring an actual approach vector is correct for a particular aerial vehicle.
- FIG. 1 is a schematic illustration of an example landing area 2 for an aerial vehicle.
- the landing area 2 can be aligned with a runway 4 .
- the runway 4 may include several identifying markers painted thereon, including various indicia.
- the runway 4 can include runway threshold markings 6 , runway designation markings 8 , runway aiming point markings 10 , runway touchdown zone markings 12 , runway centerline markings 14 , runway side stripe markings 16 , runway lighting 18 , taxiway markings including taxiway centerline 20 , taxiway edge marking 22 , taxiway lighting 24 , holding position markings 26 , holding position sign 28 , and holding position sign 30 .
- the identifying markers and indicia may be processed by the machine vision controllers described herein, to aid both in taxiing an aerial vehicle prior to takeoff and in safely landing an aerial vehicle during/subsequent to a landing approach.
- an aerial vehicle may not have full visibility of some or all of the identifying markers outlined above.
- sensor data can include optical data, video data, photograph data, radar data, and/or Light Detection and Ranging (LIDAR) data.
- the sensor data can subsequently be processed to determine if an aerial vehicle can safely land or if an avoidance maneuver is appropriate. Additionally, under some circumstances, the sensor data can be compared to a map or model of a proximal area near the aircraft.
- geographic indicators including buildings, landmarks, topographical features, runways, runway markings, and other geographic indicators can be identified to determine that the aerial vehicle is operating or attempting to land in a safe or expected area.
- scenarios involving an aerial vehicle approaching the runway 4 are described in detail with reference to FIG. 2A , FIG. 2B , FIG. 3A , and FIG. 3B .
- FIG. 2A is a schematic illustration of an aerial vehicle 200 approaching a portion 215 of the landing area of FIG. 1 .
- taxiway 40 and taxiway 42 with geographic position markings 44 including a direction sign 46 and a location sign 48 are presented on the portion 215 .
- the aerial vehicle 200 is approaching the portion 215 at approach vector 202 .
- the aerial vehicle 200 may obtain a variety of sensor data.
- FIG. 2B is a schematic illustration of example sensor data 210 obtained by the aerial vehicle 200 .
- the direct approach vector 202 allows the sensor data 210 to appear relatively undistorted.
- a machine vision controller associated with the aerial vehicle 200 can process the sensor data 210 to determine that a direction indicator 46 and position markings 44 are appropriately located relative to the aerial vehicle 200 .
- the machine vision controller may generate one or more displays for a pilot or other operator providing the direction indicator 46 , position markings 44 , and/or a representation of any other geographic indicator.
- the machine vision controller may utilize various image processing and/or machine vision processing techniques to identify objects using image data.
- Physics-based modeling and/or machine learned models may be used for object detection.
- Objects may be detected and classified using various image and/or machine vision processing techniques.
- the aerial vehicle 200 can safely continue its approach and complete landing maneuvers. However, if the aerial vehicle 200 and associated machine vision controller determine that the approach vector 202 is inappropriate or misaligned, the machine vision controller associated with the aerial vehicle 200 can provide warnings, indications, and other data to a pilot or operator to ensure the pilot is aware of the situational position and approach of the aerial vehicle 200 .
- the warnings may be used to determine that a landing cannot continue due to obstructions, misaligned vectors, incorrect geographical areas, or other misinformation. Accordingly, the aerial vehicle 200 may be equipped to efficiently correct the misinformation, alter a flight plan, and/or avoid a landing to ensure the aerial vehicle 200 lands at a correct geographic location, safely. Under other circumstances, typical machine vision processing of the sensor data 210 may be inappropriate due to a variety of issues, such as weather or approach vectors differing from the thrust vector of an aerial vehicle.
- aircraft performance data may be used to selectively trigger alerts or displays that are generated based on the sensor data such as image data.
- the machine vision controller may obtain altitude and/or aircraft configuration data from the performance data and use this information to selectively generate geographic based alerts.
- the machine vision controller may determine from the performance data whether the aircraft is below a certain altitude and/or whether the aircraft landing gear is down. If the performance data indicates that the aircraft is attempting a landing, displays may be generated based on the geographic indicators. For example, runway markers and other identifiers may be displayed if the performance data indicates that the aircraft is landing. If, however, the performance data indicates that the aircraft is not attempting a landing, alerts or displays that would otherwise be generated in response to geographic indicators may not be displayed.
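- the alert-gating logic described above can be illustrated with a short sketch. The threshold value and field names (`altitude_ft`, `gear_down`) are illustrative assumptions, not values from the patent:

```python
def should_show_landing_alerts(performance):
    """Gate geographic alerts on performance data: only alert when the
    aircraft appears to be attempting a landing (low altitude, gear down).
    The 2000 ft threshold is an assumed, illustrative value."""
    LANDING_ALTITUDE_FT = 2000
    return (performance["altitude_ft"] < LANDING_ALTITUDE_FT
            and performance["gear_down"])

def render_alerts(performance, geographic_indicators):
    """Return display strings for identified indicators, or nothing when
    the performance data indicates the aircraft is not landing."""
    if not should_show_landing_alerts(performance):
        return []  # suppress geographic alerts outside a landing attempt
    return [f"Identified: {name}" for name in geographic_indicators]
```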
- FIG. 3A is a schematic illustration of aerial vehicle 200 approaching the portion 215 of the landing area of FIG. 1 .
- the aerial vehicle 200 has a thrust vector 302 in an attempt to correct for crosswind 306 that is flowing perpendicular to the portion 215 of the landing area of FIG. 1 .
- although the thrust vector 302 is generally directed away from the landing area 215, the aerial vehicle 200 is actually on an approach vector 304. Because the direction of the aerial vehicle is out of perfect alignment with the landing area 215, sensor data obtained by the aerial vehicle 200 may be distorted or difficult to process.
- FIG. 3B is a schematic illustration of example sensor data 310 obtained by the aerial vehicle 200 of FIG. 3A on thrust vector 302 .
- the sensor data 310 appears to point towards a different landing area as compared to sensor data 210 .
- a machine vision controller associated with the aerial vehicle 200 can process the sensor data 310 to compensate for movement and/or performance data of the aerial vehicle 200 .
- the machine vision controller can determine a crosswind value 306 and performance data including the thrust vector 302 .
- the machine vision controller can also estimate the approach vector 304 based on the crosswind value 306 and the performance data.
- the machine vision controller can translate the sensor data 310 into sensor data 320 based on the approach vector 304. Therefore, the machine vision controller can correct the apparent misalignment or distortion based on the thrust vector 302, approach vector 304, and crosswind 306, resulting in sensor data 320.
- Sensor data 320 is sufficiently similar to sensor data 210 and allows for successful identification of the portion 215 , including all visual markers illustrated thereon. Additionally, under some circumstances, the sensor data 320 can be compared to a map or model of a proximal area near the aerial vehicle 200 .
- the map or model can include a two-dimensional mapping of information with associated height, depth, or other dimensional information, for example, of surrounding buildings and landmarks. Upon comparing, geographic indicators including buildings, landmarks, topographical features, runways, runway markings, and other geographic indicators can be identified to determine that the aerial vehicle 200 is operating or attempting to land in a safe or expected area.
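- the geometric correction from sensor data 310 to sensor data 320 can be sketched as rotating the sensed scene by the crab angle (the difference between the thrust heading and the actual approach heading). This is a minimal sketch under assumed conventions; a real controller would warp whole images, while rotating sparse feature points suffices to show the idea:

```python
import math

def crab_angle_deg(thrust_heading_deg, approach_heading_deg):
    """Angle between where the nose points (thrust vector 302) and where
    the aircraft actually travels (approach vector 304)."""
    return approach_heading_deg - thrust_heading_deg

def derotate_points(points, angle_deg):
    """Rotate sensor-frame feature points by the negative crab angle, so
    features appear as they would on a direct, aligned approach."""
    theta = math.radians(-angle_deg)
    c, s = math.cos(theta), math.sin(theta)
    return [(x * c - y * s, x * s + y * c) for x, y in points]
```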
- aerial vehicle 200 may include a machine vision controller configured to process sensor data, identify geographic identifiers in sensor data, compare obtained sensor data to desired sensor data, identify geographic location from sensor data, automatically maneuver the aerial vehicle responsive to sensor data and other data, determine if an approach vector is safe for landing, and/or reduce risk associated with instrumentation-only landings. Furthermore, under some circumstances, the machine vision controller can predict a flight path of the aerial vehicle 200 based on processing the sensor data. Flight path prediction may include an estimated or predicted flight path based on any of a thrust vector, crosswind, axial alignment based on a ground plane reference, instrumentation values, and other aerial vehicle performance data.
- the machine vision controller may be integrated within the aerial vehicle 200 or may be a remote controller configured to transmit calculated data to the aerial vehicle.
- the machine vision controller may also be configured to provide warnings, both audible and visual, to operators of the aerial vehicle, to aid in a plurality of aircraft maneuvers.
- a detailed description of an example aerial vehicle 200 having an integrated machine vision controller 402 is provided.
- FIG. 4 is a diagram of an example aerial vehicle 200 , according to example embodiments of the present disclosure.
- the aerial vehicle 200 can include a machine vision controller 402 configured for sensor data processing as described herein.
- the machine vision controller 402 can be a standalone controller, a customized controller, or a generic computer controller configured to process sensor data as a machine vision processor.
- the controller 402 can be configured to obtain sensor data from a first sensor array 404 .
- the sensor array 404 may be an externally mounted sensor array or an internally mounted sensor array.
- the sensor array 404 can include one or more of optical sensors, camera sensors, acoustic sensors, radar sensors, laser sensors, and/or LIDAR sensors.
- Optical sensors can include visible light sensors, infrared sensors, and other optical sensors tuned to additional frequency ranges. For example, optical sensors that can sense and encode infrared frequencies may perform better in low-light and inclement weather conditions than typical human sight.
- Other appropriate sensors may also be included in the sensor array 404 , according to any desired implementation of the aerial vehicle 200 .
- the controller 402 can also be configured to obtain sensor data from a second sensor array 406 .
- the sensor array 406 may be an externally mounted sensor array or an internally mounted sensor array.
- the sensor array 406 can include one or more of optical sensors, camera sensors, acoustic sensors, radar sensors, laser sensors, and/or LIDAR sensors.
- Optical sensors can include visible light sensors, infrared sensors, and other optical sensors tuned to additional frequency ranges. Other appropriate sensors may also be included in the sensor array 406, according to any desired implementation of the aerial vehicle 200.
- the controller 402 may be configured to communicate with external processors, controllers, and/or ground equipment and other aircraft through interface 408 .
- the interface 408 may be a standardized communications interface configured to send and receive data via antenna array 410. It is noted that although the antenna array is particularly illustrated as externally mounted on the upper surface of the aerial vehicle 200, any form of antenna array may be applicable.
- the controller 402 may also be configured to communicate with avionics system 412 of the aerial vehicle 200 over communications bus 450 .
- the controller 402 may provide appropriate information to avionics system 412 such that landing gear 422 is controlled up/down based on machine vision processing.
- the controller 402 may provide appropriate information to avionics system 412 such that thrust from engines 424 is controlled based on machine vision processing.
- the controller 402 may provide appropriate information to avionics system 412 such that control surfaces 426 are adjusted for automatic maneuvers such as landing denials, obstacle avoidance, safety maneuvers, and other movement of the aerial vehicle 200 .
- the aerial vehicle 200 may include more or fewer control components than those particularly illustrated. Furthermore, the aerial vehicle 200 may include several components, aspects, and necessary structures not particularly illustrated herein for the sake of brevity, clarity, and concise disclosure of the example embodiments described herein. Hereinafter, operational details of the aerial vehicle 200 are presented with reference to FIG. 5 and FIG. 6 .
- the controller and avionics system may generally include one or more processor(s) and associated memory configured to perform a variety of computer-implemented functions, such as various methods, steps, calculations and the like disclosed herein.
- the controller and/or avionics system may be programmable logic devices, such as Field Programmable Gate Arrays (FPGAs); however, they may be implemented using any suitable hardware and/or software.
- processor may generally refer to integrated circuits, and may also refer to a controller, microcontroller, a microcomputer, a programmable logic controller (PLC), an application specific integrated circuit (ASIC), a Field Programmable Gate Array (FPGA), and other programmable circuits.
- memory described herein may generally include memory element(s) including, but not limited to, computer readable medium (e.g., random access memory (RAM)), computer readable non-volatile medium (e.g., flash memory), a compact disc-read only memory (CD-ROM), a magneto-optical disk (MOD), a digital versatile disc (DVD) and/or other suitable memory elements or combinations thereof.
- the flight management system and vehicle control system may also include a communications interface.
- the communications interface can include associated electronic circuitry that is used to send and receive data. More specifically, the communications interface can be used to send and receive data between any of the various systems. Similarly, a communications interface at any one of the systems may be used to communicate with outside components such as another aerial vehicle and/or ground control.
- a communications interface may be any combination of suitable wired or wireless communications interfaces.
- FIG. 5 is a flow diagram of a method 500 of machine vision processing of sensor data of an aerial vehicle, according to example embodiments of the present subject matter.
- FIG. 6 is a flow diagram of a method 600 of machine vision processing of sensor data of an aerial vehicle, according to example embodiments of the present subject matter.
- system components such as controller 402 , which can comprise a processor, control application, machine vision circuitry, machine vision components and/or machine vision applications.
- system components include functionality produced by an application programming interface (API), a compiled program, an interpreted program, a network service, a script, or any other executable set of instructions.
- the operations of the methods 500 and 600 may be also implemented in many other ways.
- the methods 500 and 600 may be implemented, at least in part, by a processor of a remote controller or a local circuit, including a remote pilot control center for controlling an unmanned aerial vehicle.
- one or more of the operations of the methods 500 and 600 may alternatively or additionally be implemented, at least in part, by a chipset working alone or in conjunction with software modules on board the aerial vehicle 200 .
- Any service, circuit or application suitable for providing the techniques disclosed herein can be used in operations described herein, either remote to, onboard, or a combination thereof with respect to the aerial vehicle 200 .
- the method 500 includes obtaining sensor data from a sensor array 404 or 406 , at block 502 .
- the sensor data can include, for example, optical data or radar data obtained from at least one sensor of the sensor arrays 404 and 406 .
- Optical sensor data can also be obtained, and can include stereoscopic images, still images, or video recorded from an optical sensor of the sensor arrays 404 and 406 .
- Other sensor data can include altitude, distance from ground, and other suitable data.
- the method 500 further includes obtaining performance data from avionics 412 and/or sensor array 404 or 406 , at block 504 .
- the performance data can include, for example, a calculated approach vector, a thrust vector, rate of ascent, rate of descent, angle relative to a ground plane, and other suitable performance data.
- the performance data is obtained from sensors and/or avionics system 412 .
- the performance data may be obtained directly from sensors and/or avionics system 412 or may be derived from sensor data obtained from the sensors.
- the performance data is input manually.
- the performance data may be based on positions of control surfaces and engine status provided by avionics 412 or other components of the aerial vehicle 200 .
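- among the performance data listed above, the approach vector can be derived from the thrust vector and the wind by simple vector addition (ground track = air vector + wind vector). The sketch below assumes standard aviation conventions (headings in degrees from north, wind given as the direction it blows from); it is an illustration, not the patent's calculation:

```python
import math

def ground_track(thrust_heading_deg, airspeed_kt, wind_from_deg, wind_kt):
    """Estimate the actual approach (ground-track) heading and ground
    speed as the vector sum of the air vector and the wind vector."""
    def to_xy(heading_deg, speed):
        r = math.radians(heading_deg)
        return (speed * math.sin(r), speed * math.cos(r))  # (east, north)

    ax, ay = to_xy(thrust_heading_deg, airspeed_kt)
    # Wind reported as "from" a direction blows toward the opposite one.
    wx, wy = to_xy(wind_from_deg + 180.0, wind_kt)
    gx, gy = ax + wx, ay + wy
    track_deg = math.degrees(math.atan2(gx, gy)) % 360.0
    ground_speed = math.hypot(gx, gy)
    return track_deg, ground_speed
```

For example, flying north at 100 kt with a 10 kt wind from the west pushes the ground track a few degrees east of the nose heading, which is exactly the thrust-vector/approach-vector mismatch shown in FIG. 3A.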
- The method 500 further includes processing the sensor data based on the performance data, at block 506.
- Block 506 may include processing the sensor data to compensate for the performance data.
- The sensor data is processed with a machine vision controller using machine vision and/or image processing techniques.
- The sensor data is processed to compensate for movement of the aerial vehicle as determined from the performance data.
- The processing can include processing a distorted image using machine vision to create an image with reduced, minimized, or corrected distortion.
- The corrected image may include legible indicia and other corrections.
- The processing can also include processing to compensate for movement, such as by, for example, correcting perspectives of buildings, landmarks, and other features.
- The processing can include isolating landmarks, indicia, geographic features, and other optical data for further identification in subsequent or parallel machine vision processing.
- The sensor data 310 can be processed to create corrected sensor data 320.
- The machine vision controller 402 may process the sensor data 310 to correct apparent distortion, misalignments, and other issues using the performance data and any other available data.
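The movement compensation at block 506 can be sketched as a coordinate correction. As a minimal illustration (not the disclosed implementation), assuming a pure roll appears as an in-plane rotation of the image, detected feature coordinates can be counter-rotated by the roll angle taken from the performance data:

```python
import math

def compensate_roll(points, roll_deg, center=(0.0, 0.0)):
    """Rotate detected feature coordinates to undo the apparent
    in-plane rotation caused by aircraft roll. The pure-rotation
    model is a hypothetical simplification for illustration."""
    cx, cy = center
    t = math.radians(-roll_deg)  # counter-rotate to undo the roll
    out = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        out.append((cx + dx * math.cos(t) - dy * math.sin(t),
                    cy + dx * math.sin(t) + dy * math.cos(t)))
    return out
```

A full implementation would instead warp the image itself (e.g., with a homography built from roll, pitch, and yaw), but the corrective idea is the same.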
- The method 500 further includes identifying a geographic indicator in the processed sensor data, at block 508.
- The machine vision controller 402 can interpret the processed sensor data to identify visual indicators such as those illustrated in FIG. 1.
- Other geographic indicators can include, for example, a number of buildings surrounding a landing area, height of buildings surrounding a landing area, surrounding higher elevation (such as mountains or natural features), surrounding lower elevation (such as crevasses or natural features), and other geographic indicators.
- Geographic indicators can also include GPS data, rivers, bodies of water, physical or virtual beacons, temporary structures such as towers/cones, and other indicators.
- The method 500 further includes determining a geographic location of the aerial vehicle based on the identified geographic indicator, at block 510.
- The machine vision controller 402 can compare the identified geographic indicators to a predetermined set of geographic indicators, such as expected geographic features or a map or model of expected features, to determine a location of the aerial vehicle 200. For example, the machine vision controller 402 can determine whether an expected control tower at an airport is detectable in the sensor data. The machine vision controller 402 can also determine whether airport buildings are detectable in the sensor data. Thus, using all available sensor data, the machine vision controller 402 can compare the sensor data to expected data from a flight plan to determine the geographic location.
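The comparison at block 510 can be sketched as scoring candidate locations by how many of their expected indicators were observed. The location names and indicator labels below are illustrative assumptions, not values from the disclosure:

```python
def best_location_match(observed, known_locations):
    """Score each candidate location by the fraction of its expected
    geographic indicators found in the observed set, and return the
    best (name, score) pair."""
    observed = set(observed)
    def score(expected):
        expected = set(expected)
        return len(expected & observed) / len(expected)
    return max(((name, score(exp)) for name, exp in known_locations.items()),
               key=lambda pair: pair[1])
```

A production matcher would weight indicators by distinctiveness and detection confidence rather than counting them equally; this sketch only shows the compare-to-expected-set step.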
- A model of expected features may include depth or height information.
- In some embodiments, the machine vision controller 402 is configured to determine a height of geographic features in a proximal location to the aerial vehicle based on processing the sensor data. For example, the machine vision controller 402 can access one or more images obtained by one or more image sensors to identify one or more geographic features. A depth camera, radar/LIDAR, and/or other sensors may be used to determine three-dimensional or height information associated with the geographic features.
- The machine vision controller 402 can access a database of predetermined geographic indicators associated with a geographic location, such as known landing locations. The database can include height or other depth information associated with the predetermined geographic indicators.
- The machine vision controller 402 can compare the image data to the database, including the height of geographic indicators, in order to identify geographic features.
- The machine vision controller 402 can determine the geographic location corresponding to a geographic feature by comparing the height of geographic features from the sensor data to height information in the database.
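The height comparison can be sketched as a tolerance lookup; the database layout and tolerance are hypothetical simplifications of the lookup described above:

```python
def match_features_by_height(measured_height_m, height_db, tol_m=5.0):
    """Return names of known geographic indicators whose stored height
    is within tol_m of the height measured from depth/LIDAR data.
    The 5 m tolerance is an illustrative assumption."""
    return sorted(name for name, h in height_db.items()
                  if abs(h - measured_height_m) <= tol_m)
```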
- Based on the determined geographic location, a pilot or operator of the aerial vehicle 200 can decide whether to safely land the aerial vehicle, alter a flight plan, or otherwise control the aerial vehicle. It is contemplated that visual indicators such as warnings and audio indicators such as alarms can also be used. For example, a visual indicator that landing can or cannot be performed safely can be provided. In some implementations, the indicator may identify a runway or other feature with which the aerial vehicle is aligned for landing. In some implementations, an indicator may include a warning that the aerial vehicle is misaligned. For example, the system may compare the geographic indicator with flight plan data and provide a warning if the geographic indicator is not associated with the flight plan data.
- A warning can also be provided if geographic indicators include obstructions identified in machine vision processing.
- Visual and audio indicators can include positive indications that a landing pattern is deemed safe by the machine vision controller 402, such that the pilot or operator can concentrate on other maneuvers to complete a safe landing.
- FIG. 6 is a flow diagram of an additional method 600 of machine vision processing of sensor data of an aerial vehicle, according to example embodiments of the present disclosure.
- The method 600 includes obtaining sensor data from a sensor array 404 or 406, at block 602.
- The sensor data may be substantially similar to the sensor data described above.
- The method 600 further includes processing the obtained sensor data to identify geographic features in the area proximal to the aerial vehicle, at block 604.
- The processing may include correcting distortion, compensating for movement, identifying obstacles, and/or matching flight plans to expected geographic features.
- The processing can include processing a distorted image, or distorted sensor data, using machine vision to create a processed image or processed sensor data with reduced, minimized, or corrected distortion.
- The processed image may include legible indicia and other corrections.
- The processing can also include processing to compensate for movement, such as by, for example, correcting perspectives of buildings, landmarks, and other features. Additionally, the processing can include isolating landmarks, indicia, geographic features, and other optical data for further identification in subsequent or parallel machine vision processing.
- Processing the sensor data can include processing image data based on aircraft misalignment.
- Performance data such as crosswind and other information can be used to determine a misalignment of an aircraft approach vector with the image data obtained from one or more sensors.
- The performance data can be used to translate or otherwise determine modified sensor data to correct or compensate for the misalignment.
- The image data may identify a first runway according to a first runway marker, for example.
- The performance data may indicate that the aerial vehicle is landing in a strong crosswind.
- This and/or other performance data may indicate that the aerial vehicle is misaligned with its approach vector.
- This information may be used to modify the sensor data to indicate the appropriate information corresponding with the approach.
- The machine vision controller 402 may determine that the aircraft is actually approaching a second runway adjacent to the first runway. The determination may be made even though the image data corresponds to the first runway.
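The runway re-identification above can be sketched as projecting the crosswind-induced drift onto the runway plane and selecting the nearest centerline. The drift model and all numbers are illustrative assumptions, not values from the disclosure:

```python
def likely_runway(seen_offset_m, drift_rate_mps, time_to_touchdown_s,
                  centerlines):
    """Predict the lateral touchdown position as the currently imaged
    offset plus accumulated crosswind drift, then return the name of
    the runway whose centerline lies closest to that prediction.
    centerlines: list of (name, lateral_offset_m) pairs."""
    predicted = seen_offset_m + drift_rate_mps * time_to_touchdown_s
    return min(centerlines, key=lambda rc: abs(rc[1] - predicted))[0]
```

For example, a camera boresighted on one runway while the aircraft drifts toward an adjacent parallel runway would resolve to the adjacent runway, matching the first-runway/second-runway scenario described above.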
- The method 600 further includes identifying a geographic indicator in the processed sensor data, at block 606.
- The geographic indicator may include a desired indicator or an obstacle.
- The geographic indicator may include a landmark, body of water, airport, particular runway at an airport, or other geographic indicator, including all geographic indicators described herein.
- The geographic indicator can be identified in the processed sensor data.
- The machine vision controller can also identify obstacles at block 606.
- An obstacle can include any object or indicia that is not expected in a landing area. Obstacles can include temporary or permanent obstacles.
- Obstacles can further include indicia purposefully marked (e.g., an ‘X’ or ‘NOT SAFE FOR LANDING’) to allow pilots and operators to identify that a landing area is not to be used for landings.
- The machine vision controller 402 can immediately indicate the presence of an obstacle.
- Alternatively, the machine vision controller 402 can delay an alert if an obstacle is a moving obstacle, such as a landing assist vehicle, that will be clear of an active landing area prior to landing by the aerial vehicle.
- The machine vision controller 402 can issue multiple forms of soft alerts and high alerts if moving obstacles are deemed hazardous, are not moving as expected, or are not following an expected pattern. Processing and identifying obstacles can be aided through use of radar/LIDAR information and other suitable data available from sensors.
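The tiered alerting just described can be sketched as a small decision rule. The alert labels are illustrative, not terms from the disclosure:

```python
def classify_obstacle_alert(moving, clears_before_landing, follows_pattern):
    """Map obstacle observations to an alert tier, mirroring the
    behavior above: immediate alert for static obstructions, delayed
    alert for cooperative moving vehicles, escalation otherwise."""
    if not moving:
        return "HIGH"        # static obstruction in the landing area
    if not follows_pattern:
        return "HIGH"        # moving unpredictably / hazardous
    if clears_before_landing:
        return "DELAYED"     # e.g., a landing assist vehicle exiting
    return "SOFT"            # moving as expected but not yet clear
```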
- The method 600 further includes determining if the identified geographic indicator is in a flight plan associated with the aerial vehicle, at block 608.
- The machine vision controller 402 may process the sensor data and compare it to expected sensor data of the flight plan.
- The expected sensor data may include maps, models, renderings, and/or other suitable data for machine vision processing and comparison to the processed sensor data.
- The expected sensor data can include, for example, a two-dimensional map having associated height or dimensional information.
- Machine vision processing can facilitate comparisons between the processed sensor data and this expected sensor data to determine if a match exists or if the geographic indicator exists within the flight plan.
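The block 608 check can be sketched as a proximity test against flight-plan features. The (type, x, y) representation in a local metric frame and the distance threshold are illustrative assumptions:

```python
def indicator_in_flight_plan(indicator, flight_plan_features, max_dist_m=100.0):
    """Treat the identified indicator as present in the flight plan if a
    feature of the same type lies within max_dist_m of it. Both the
    indicator and plan features are (type, x, y) tuples (hypothetical
    representation for illustration)."""
    itype, ix, iy = indicator
    return any(ftype == itype
               and ((fx - ix) ** 2 + (fy - iy) ** 2) ** 0.5 <= max_dist_m
               for ftype, fx, fy in flight_plan_features)
```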
- The method 600 further includes automatically maneuvering the aerial vehicle, or providing a visual indication, based on the determination, at block 610.
- Based on the determination, the aerial vehicle may be automatically maneuvered to avoid a landing or to continue flying.
- Other automatic maneuvers, including obstacle avoidance, fast touchdown and takeoff, and similar maneuvers based on geographic indicators, are also applicable.
- The machine vision controller 402 can also provide video or audio indications, as described above.
- FIG. 7 depicts a block diagram of an example computing system 1000 that can be used to implement methods and systems according to example embodiments of the present disclosure.
- Computing system 1000 may be used to implement a machine vision controller 402 as described herein. It will be appreciated, however, that computing system 1000 is one example of a suitable computing system for implementing the machine vision controller 402 and other computing elements described herein.
- The computing system 1000 can include one or more computing device(s) 1002.
- The one or more computing device(s) 1002 can include one or more processor(s) 1004 and one or more memory device(s) 1006.
- The one or more processor(s) 1004 can include any suitable processing device, such as a microprocessor, microcontroller, integrated circuit, logic device, or other suitable processing device.
- The one or more memory device(s) 1006 can include one or more computer-readable media, including, but not limited to, non-transitory computer-readable media, RAM, ROM, hard drives, flash drives, or other memory devices.
- The one or more memory device(s) 1006 can store information accessible by the one or more processor(s) 1004, including computer-readable instructions 1008 that can be executed by the one or more processor(s) 1004.
- The instructions 1008 can be any set of instructions that, when executed by the one or more processor(s) 1004, cause the one or more processor(s) 1004 to perform operations.
- The instructions 1008 can be software written in any suitable programming language or can be implemented in hardware.
- The instructions 1008 can be executed by the one or more processor(s) 1004 to cause the one or more processor(s) 1004 to perform operations, such as the operations for machine vision processing of sensor data, identifying geographic features and indicators, locating an aerial vehicle based on sensor data, and other operations such as those described above with reference to methods 500 and 600, and/or any other operations or functions of the one or more computing device(s) 1002.
- The memory device(s) 1006 can further store data 1010 that can be accessed by the processor(s) 1004.
- The data 1010 can include sensor data such as engine parameters, model data, logic data, etc., as described herein.
- The data 1010 can include one or more table(s), function(s), algorithm(s), model(s), equation(s), etc., according to example embodiments of the present disclosure.
- The one or more computing device(s) 1002 can also include a communication interface 1012 used to communicate, for example, with other components of the system.
- The communication interface 1012 can include any suitable components for interfacing with one or more network(s), including, for example, transmitters, receivers, ports, controllers, antennas, or other suitable components.
Description
- The present subject matter relates generally to aerial vehicles and, in particular, to aerial vehicles and other aircraft with machine vision.
- Generally, it is useful to know a geographic location of an aircraft relative to a desired landing position or area, such as a runway, landing pad, or other suitable landing area. During flight, the position of the aircraft relative to the desired landing area aids in successfully landing the aircraft. Furthermore, understanding an approach vector during a landing attempt may further aid in safely landing the aircraft.
- While aircraft generally have a plurality of instrumentation available to aid pilots and operators of the aircraft to safely maneuver the aircraft during flight, there may be limited instrumentation to aid in landing in other scenarios. There may be limited aid from instrumentation when a position is unknown, there is decreased visibility from a control position of the aircraft, and/or other issues in determining a position of an aircraft are apparent. As an example, limited visibility may result in a pilot misidentifying a runway or landing area. Furthermore, limited visibility may result in an airport runway obstruction going unnoticed by a pilot or support crew. Moreover, other conditions may result in misidentification of a landing area or landing in dangerous conditions.
- Current systems require complicated radar systems, global positioning systems (GPS), and communication methodologies in an attempt to avoid these and other potential issues. Many of these systems depend on resources external to the aircraft and on communication with the aircraft, making them subject to loss of utility if communication with a pilot is lost. Additionally, many GPS-based systems will not provide any helpful information with regard to obstructions, unsafe landing conditions, or misidentification of a runway if a general position (based on GPS) is correct.
- Aspects and advantages of the disclosed technology will be set forth in part in the following description, or may be obvious from the description, or may be learned through practice of the disclosure.
- According to one example embodiment, an aerial vehicle is provided. The aerial vehicle can include a plurality of sensors mounted thereon, an avionics system configured to operate at least a portion of the aerial vehicle, and a machine vision controller in operative communication with the avionics system and the plurality of sensors. The machine vision controller can be configured to perform a method. The method can include obtaining sensor data from at least one sensor of the plurality of sensors. The sensor data is associated with a proximal location of the aerial vehicle and the proximal location is within sensor range of the aerial vehicle. The method can also include determining performance data associated with the aerial vehicle, processing the sensor data based on the performance data to compensate for movement of the aerial vehicle, identifying at least one geographic indicator based on processing the sensor data, and determining a geographic location of the aerial vehicle based on the at least one geographic indicator.
- According to another example embodiment, an unmanned aerial vehicle is provided. The unmanned aerial vehicle can include a plurality of sensors mounted thereon, an avionics system configured to operate at least a portion of the unmanned aerial vehicle, and a machine vision controller in operative communication with the avionics system and the plurality of sensors. The machine vision controller can be configured to perform a method. The method can include obtaining sensor data from at least one sensor of the plurality of sensors. The sensor data is associated with a proximal location of the unmanned aerial vehicle and the proximal location is within sensor range of the unmanned aerial vehicle. The method can also include identifying at least one geographic indicator based on processing the sensor data, determining whether the at least one geographic indicator is associated with a flight plan of the unmanned aerial vehicle, and performing an automatic maneuver of the unmanned aerial vehicle based on the determining.
- According to another example embodiment, a method of locating an aerial vehicle relative to a proximal location is provided. The method can include obtaining sensor data from one or more optical sensors mounted on the aerial vehicle. The sensor data is associated with the proximal location and the proximal location is within sensor range of the aerial vehicle. The method can also include determining performance data from one or more additional sensors mounted on the aerial vehicle, processing the sensor data based on the performance data to compensate for movement of the aerial vehicle, identifying at least one geographic indicator based on processing the sensor data, and determining a geographic location of the aerial vehicle based on the at least one geographic indicator.
- These and other features, aspects and advantages of the disclosed technology will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosed technology and, together with the description, serve to explain the principles of the disclosed technology.
- A full and enabling disclosure of the present disclosure, including the best mode thereof, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures, in which:
-
FIG. 1 is a schematic illustration of an example runway or landing area for an aerial vehicle. -
FIG. 2A is a schematic illustration of an aerial vehicle approaching a portion of the landing area of FIG. 1. -
FIG. 2B is a schematic illustration of example sensor data obtained by the aerial vehicle of FIG. 2A. -
FIG. 3A is a schematic illustration of an additional aerial vehicle approaching a portion of the landing area of FIG. 1. -
FIG. 3B is a schematic illustration of example sensor data obtained by the aerial vehicle of FIG. 3A. -
FIG. 4 is a diagram of an example aerial vehicle, according to example embodiments of the present disclosure. -
FIG. 5 is a flow diagram of a method of machine vision processing of sensor data of an aerial vehicle, according to example embodiments of the present disclosure. -
FIG. 6 is a flow diagram of an additional method of machine vision processing of sensor data of an aerial vehicle, according to example embodiments of the present disclosure. -
FIG. 7 is a block diagram of an example computing system that can be used to implement methods and systems according to example embodiments of the present disclosure. - Reference now will be made in detail to embodiments of the disclosed technology, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the disclosed technology, not limitation of the disclosed technology. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made in the present disclosure without departing from the scope or spirit of the claims. For instance, features illustrated or described as part of example embodiments can be used with another embodiment to yield a still further embodiment. Thus, it is intended that the present disclosure covers such modifications and variations as come within the scope of the appended claims and their equivalents.
- As used in the specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. The use of the term “about” in conjunction with a numerical value refers to within 25% of the stated amount.
- In example aspects, a machine vision controller for an aerial vehicle or aircraft is provided. In some embodiments, the machine vision controller may execute computer-readable program instructions for: processing sensor data, identifying geographic identifiers in sensor data, comparing obtained sensor data to desired sensor data, identifying geographic location from sensor data, automatically maneuvering the aerial vehicle responsive to sensor data and other data, determining if an approach vector is safe for landing, reducing risk associated with instrumentation-only landings, and/or other program instructions. The machine vision controller may be integrated within an aerial vehicle or may be a remote controller configured to transmit calculated data to the aerial vehicle. Additionally, the machine vision controller may also be configured to provide warnings, both audial and visual, to operators of the aerial vehicle, to aid in a plurality of aircraft maneuvers. Hereinafter, a detailed description of several example embodiments of aerial vehicles and machine vision controllers are provided in detail.
- Embodiments of the disclosed technology provide a number of technical benefits and advantages, particularly in the area of aircraft safety. As one example, the disclosed technology provides for safer landings by ensuring an aerial vehicle is approaching an appropriate runway or airport. The disclosed technology can also aid in automatic avoidance maneuvers through identifying geographic indicators. By identifying geographic indicators and matching geographic indicators to a map or model of an expected proximal area, the disclosed systems can avoid initiating a landing operation if an area is unexpected or if obstacles exist.
- Embodiments of the disclosed technology additionally provide a number of technical benefits and advantages in the area of computing technology. For example, the disclosed system can obtain sensor data associated with performance data of an aerial vehicle and automatically determine whether performance of an aircraft is affecting a landing approach. A computing system implemented in accordance with the disclosed technology can therefore avoid costly landing maneuvers in adverse conditions or implement safer landing procedures through assuring an actual approach vector is correct for a particular aerial vehicle.
-
FIG. 1 is a schematic illustration of an example landing area 2 for an aerial vehicle. Generally, the landing area 2 can be aligned with a runway 4. The runway 4 may include several identifying markers painted thereon, including several indicia. For example, the runway 4 can include runway threshold markings 6, runway designation markings 8, runway aiming point markings 10, runway touchdown zone markings 12, runway centerline markings 14, runway side stripe markings 16, runway lighting 18, taxiway markings including taxiway centerline 20, taxiway edge marking 22, taxiway lighting 24, holding position markings 26, holding position sign 28, and holding position sign 30. The identifying markers and indicia may be processed by the machine vision controllers described herein, to aid in both taxiing an aerial vehicle prior to takeoff and safely landing an aerial vehicle during/subsequent to a landing approach. - In some circumstances, for example during relatively poor weather or with limited visibility, an aerial vehicle may not have full visibility of some or all of the identifying markers outlined above. In these circumstances, it may be beneficial for an aerial vehicle to obtain sensor data, such as optical data, video data, photograph data, radar data, and/or Light Detection and Ranging (LIDAR) data. The sensor data can subsequently be processed to determine if an aerial vehicle can safely land or if an avoidance maneuver is appropriate. Additionally, under some circumstances, the sensor data can be compared to a map or model of a proximal area near the aircraft. Upon comparing, geographic indicators including buildings, landmarks, topographical features, runways, runway markings, and other geographic indicators can be identified to determine that the aerial vehicle is operating or attempting to land in a safe or expected area. Hereinafter, scenarios involving an aerial vehicle approaching the runway 4 are described in detail with reference to
FIG. 2A, FIG. 2B, FIG. 3A, and FIG. 3B. -
FIG. 2A is a schematic illustration of an aerial vehicle 200 approaching a portion 215 of the landing area of FIG. 1. For example, taxiway 40 and taxiway 42 with geographic position markings 44 including a direction sign 46 and a location sign 48 are presented on the portion 215. Furthermore, the aerial vehicle 200 is approaching the portion 215 at approach vector 202. - During the approach, the
aerial vehicle 200 may obtain a variety of sensor data. FIG. 2B is a schematic illustration of example sensor data 210 obtained by the aerial vehicle 200. As illustrated, the direct approach vector 202 allows the sensor data 210 to appear relatively undistorted. In this example, a machine vision controller associated with the aerial vehicle 200 can process the sensor data 210 to determine that a direction indicator 46 and position markings 44 are appropriately located relative to the aerial vehicle 200. In some implementations, the machine vision controller may generate one or more displays for a pilot or other operator providing the direction indicator 46, position markings 44, and/or a representation of any other geographic indicator. - The machine vision controller may utilize various image processing and/or machine vision processing techniques to identify objects using image data. Physics-based modeling and/or machine learned models may be used for object detection. Objects may be detected and classified using various image and/or machine vision processing techniques.
- Accordingly, the
aerial vehicle 200 can safely continue its approach and complete landing maneuvers. However, if the aerial vehicle 200 and associated machine vision controller determine that the approach vector 202 is inappropriate or misaligned, the machine vision controller associated with the aerial vehicle 200 can provide warnings, indications, and other data to a pilot or operator to ensure the pilot is aware of the situational position and approach of the aerial vehicle 200. - Under some circumstances, the warnings may be used to determine that a landing cannot continue due to obstructions, misaligned vectors, incorrect geographical areas, or other misinformation. Accordingly, the
aerial vehicle 200 may be equipped to efficiently correct the misinformation, alter a flight plan, and/or avoid a landing to ensure the aerial vehicle 200 lands safely at a correct geographic location. Under other circumstances, typical machine vision processing of the sensor data 210 may be inappropriate due to a variety of issues, such as weather or approach vectors differing from the thrust vector of an aerial vehicle.
-
FIG. 3A is a schematic illustration of aerial vehicle 200 approaching the portion 215 of the landing area of FIG. 1. As illustrated, the aerial vehicle 200 has a thrust vector 302 in an attempt to correct for crosswind 306 that is flowing perpendicular to the portion 215 of the landing area of FIG. 1. Thus, while the thrust vector 302 is generally directed away from the landing area 215, the aerial vehicle 200 is actually on an approach vector 304. Due to the direction of the aerial vehicle being out of perfect alignment with the landing area 215, sensor data obtained by the aerial vehicle 200 may be distorted or difficult to process. - As an example,
FIG. 3B is a schematic illustration of example sensor data 310 obtained by the aerial vehicle 200 of FIG. 3A on thrust vector 302. As shown, due to the thrust vector 302, and therefore the axial alignment of the aerial vehicle 200, not aligning with the portion 215 of the landing area, the sensor data 310 appears to point towards a different landing area as compared to sensor data 210. In this scenario, a machine vision controller associated with the aerial vehicle 200 can process the sensor data 310 to compensate for movement and/or performance data of the aerial vehicle 200. For example, the machine vision controller can determine a crosswind value 306 and performance data including the thrust vector 302. The machine vision controller can also estimate the approach vector 304 based on the crosswind value 306 and the performance data. Thereafter, the machine vision controller can translate the sensor data 310 into sensor data 320 based on the approach vector 304. Therefore, the machine vision controller can correct the apparent misalignment or distortion based on the thrust vector 302, approach vector 304, and crosswind 306, resulting in sensor data 320. Sensor data 320 is sufficiently similar to sensor data 210 and allows for successful identification of the portion 215, including all visual markers illustrated thereon. Additionally, under some circumstances, the sensor data 320 can be compared to a map or model of a proximal area near the aerial vehicle 200. The map or model can include a two-dimensional mapping of information with associated height, depth, or other dimensional information, for example, of surrounding buildings and landmarks. Upon comparing, geographic indicators including buildings, landmarks, topographical features, runways, runway markings, and other geographic indicators can be identified to determine that the aerial vehicle 200 is operating or attempting to land in a safe or expected area. - As described above,
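The geometry of FIG. 3A implies a simple kinematic relationship: the ground-track approach vector is the sum of the thrust (air) vector and the wind vector. A minimal 2-D sketch (units and values illustrative, not from the disclosure):

```python
def approach_vector(thrust_vec, wind_vec):
    """Estimate the ground-track approach vector as the vector sum of
    the aircraft's thrust (air) vector and the wind vector — the
    simple model implied by the crosswind crab in FIG. 3A.
    Vectors are (x, y) tuples in arbitrary consistent units."""
    return (thrust_vec[0] + wind_vec[0], thrust_vec[1] + wind_vec[1])
```

For example, a thrust vector crabbed 20 units into a 20-unit crosswind yields a ground track aligned straight down the runway.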
aerial vehicle 200 may include a machine vision controller configured to process sensor data, identify geographic identifiers in sensor data, compare obtained sensor data to desired sensor data, identify geographic location from sensor data, automatically maneuver the aerial vehicle responsive to sensor data and other data, determine if an approach vector is safe for landing, and/or reduce risk associated with instrumentation-only landings. Furthermore, under some circumstances, the machine vision controller can predict a flight path of the aerial vehicle 200 based on processing the sensor data. Flight path prediction may include an estimated or predicted flight path based on any of a thrust vector, crosswind, axial alignment based on a ground plane reference, instrumentation values, and other aerial vehicle performance data. The machine vision controller may be integrated within the aerial vehicle 200 or may be a remote controller configured to transmit calculated data to the aerial vehicle. The machine vision controller may also be configured to provide warnings, both audial and visual, to operators of the aerial vehicle, to aid in a plurality of aircraft maneuvers. Hereinafter, a detailed description of an example aerial vehicle 200 having an integrated machine vision controller 402 is provided. -
FIG. 4 is a diagram of an example aerial vehicle 200, according to example embodiments of the present disclosure. As illustrated, the aerial vehicle 200 can include a machine vision controller 402 configured for sensor data processing as described herein. The machine vision controller 402 can be a standalone controller, a customized controller, or a generic computer controller configured to process sensor data as a machine vision processor. - The
controller 402 can be configured to obtain sensor data from a first sensor array 404. The sensor array 404 may be an externally mounted sensor array or an internally mounted sensor array. The sensor array 404 can include one or more of optical sensors, camera sensors, acoustic sensors, radar sensors, laser sensors, and/or LIDAR sensors. Optical sensors can include visible light sensors, infrared sensors, and other optical sensors tuned to more or fewer frequencies. For example, optical sensors that can sense and encode infrared frequencies may perform better in low-light and inclement weather conditions than typical human sight. Other appropriate sensors may also be included in the sensor array 404, according to any desired implementation of the aerial vehicle 200. - The
controller 402 can also be configured to obtain sensor data from a second sensor array 406. The sensor array 406 may be an externally mounted sensor array or an internally mounted sensor array. The sensor array 406 can include one or more of optical sensors, camera sensors, acoustic sensors, radar sensors, laser sensors, and/or LIDAR sensors. Optical sensors can include visible light sensors, infrared sensors, and other optical sensors tuned to more or fewer frequencies. Other appropriate sensors may also be included in the sensor array 406, according to any desired implementation of the aerial vehicle 200. - The
controller 402 may be configured to communicate with external processors, controllers, and/or ground equipment and other aircraft through interface 408. The interface 408 may be a standardized communications interface configured to send and receive data via antenna array 410. It is noted that although the antenna array is particularly illustrated as externally mounted on the upper surface of the aerial vehicle 200, any form of antenna array may be applicable. - The
controller 402 may also be configured to communicate with avionics system 412 of the aerial vehicle 200 over communications bus 450. For example, the controller 402 may provide appropriate information to avionics system 412 such that landing gear 422 is raised or lowered based on machine vision processing. Furthermore, the controller 402 may provide appropriate information to avionics system 412 such that thrust from engines 424 is controlled based on machine vision processing. Moreover, the controller 402 may provide appropriate information to avionics system 412 such that control surfaces 426 are adjusted for automatic maneuvers such as landing denials, obstacle avoidance, safety maneuvers, and other movement of the aerial vehicle 200. - It should be readily understood that the
aerial vehicle 200 may include more or fewer control components than those particularly illustrated. Furthermore, the aerial vehicle 200 may include several components, aspects, and necessary structures not particularly illustrated herein for the sake of brevity, clarity, and concise disclosure of the example embodiments described herein. Hereinafter, operational details of the aerial vehicle 200 are presented with reference to FIG. 5 and FIG. 6. - The controller and avionics system may generally include one or more processor(s) and associated memory configured to perform a variety of computer-implemented functions, such as the various methods, steps, and calculations disclosed herein. In some examples, the controller and/or avionics system may be programmable logic devices, such as Field Programmable Gate Arrays (FPGAs); however, they may be implemented using any suitable hardware and/or software.
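The controller-to-avionics interactions described above, in which landing gear, engine thrust, and control surfaces are commanded over a bus, can be sketched as a minimal command dispatcher. The disclosure does not specify a bus protocol or message schema; the subsystem names, actions, and fields below are assumptions for illustration only.

```python
def build_avionics_command(subsystem, action, value=None):
    """Assemble a command message for the avionics system; the schema
    here is hypothetical, standing in for traffic on a bus such as 450."""
    allowed = {
        "landing_gear": {"up", "down"},
        "engines": {"set_thrust"},
        "control_surfaces": {"adjust"},
    }
    if subsystem not in allowed or action not in allowed[subsystem]:
        raise ValueError(f"unsupported command: {subsystem}/{action}")
    return {"subsystem": subsystem, "action": action, "value": value}

# Machine vision processing might conclude that gear should come down
# and thrust should be reduced for an approach.
gear_cmd = build_avionics_command("landing_gear", "down")
thrust_cmd = build_avionics_command("engines", "set_thrust", 0.65)
```

Validating commands against a small allow-list at the dispatch boundary keeps malformed machine-vision outputs from reaching flight-critical subsystems.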
- The term processor may generally refer to integrated circuits, and may also refer to a controller, a microcontroller, a microcomputer, a programmable logic controller (PLC), an application specific integrated circuit (ASIC), a Field Programmable Gate Array (FPGA), and other programmable circuits. Additionally, the memory described herein may generally include memory element(s) including, but not limited to, computer readable media (e.g., random access memory (RAM)), computer readable non-volatile media (e.g., flash memory), a compact disc-read only memory (CD-ROM), a magneto-optical disk (MOD), a digital versatile disc (DVD), and/or other suitable memory elements or combinations thereof.
- Any one or a combination of the flight management system and vehicle control system may also include a communications interface. The communications interface can include associated electronic circuitry that is used to send and receive data. More specifically, the communications interface can be used to send and receive data between any of the various systems. Similarly, a communications interface at any one of the systems may be used to communicate with outside components such as another aerial vehicle and/or ground control. A communications interface may be any combination of suitable wired or wireless communications interfaces.
-
FIG. 5 is a flow diagram of a method 500 of machine vision processing of sensor data of an aerial vehicle, and FIG. 6 is a flow diagram of a method 600 of machine vision processing of sensor data of an aerial vehicle, according to example embodiments of the present subject matter. It should be understood that the operations of the methods disclosed herein are not necessarily presented in any particular order and that performance of some or all of the operations in an alternative order is possible and is contemplated. The operations have been presented in the demonstrated order for ease of description and illustration. Operations may be added, omitted, and/or performed simultaneously without departing from the scope of the appended claims. - For example, the operations of the
methods 500 and 600 are described as being implemented, at least in part, by the controller 402, which can comprise a processor, a control application, machine vision circuitry, machine vision components, and/or machine vision applications. In some configurations, the system components include functionality produced by an application programming interface (API), a compiled program, an interpreted program, a network service, a script, or any other executable set of instructions. - Although the following illustration refers to the components of
FIG. 4, it can be appreciated that the operations of the methods 500 and 600 may also be implemented in many other ways, including by components other than those specifically illustrated and by components remote from the aerial vehicle 200. Any service, circuit, or application suitable for providing the techniques disclosed herein can be used in the operations described herein, whether remote to the aerial vehicle 200, onboard it, or a combination thereof. - As shown in
FIG. 5, the method 500 includes obtaining sensor data from a sensor array, at block 502. The sensor data can include, for example, optical data or radar data obtained from at least one sensor of the sensor arrays 404 and/or 406. - The
method 500 further includes obtaining performance data from avionics 412 and/or a sensor array, at block 504. The performance data can include, for example, a calculated approach vector, a thrust vector, a rate of ascent, a rate of descent, an angle relative to a ground plane, and other suitable performance data. In at least one example, the performance data is obtained from sensors and/or avionics system 412. The performance data may be obtained directly from the sensors and/or avionics system 412 or may be derived from sensor data obtained from the sensors. According to other examples, the performance data is input manually. Still according to other examples, the performance data may be based on positions of control surfaces and engine status provided by avionics 412 or other components of the aerial vehicle 200. - The
method 500 further includes processing the sensor data based on the performance data, at block 506. In some examples, block 506 may include processing the sensor data to compensate for the performance data. In some examples, the sensor data is processed with a machine vision controller using machine vision and/or image processing techniques. In some examples, the sensor data is processed to compensate for movement of the aerial vehicle as determined from the performance data. According to at least one example, the processing can include processing a distorted image using machine vision to create an image with reduced, minimized, or corrected distortion. The corrected image may include legible indicia and other corrections. The processing can also include compensating for movement, for example, by correcting perspectives of buildings, landmarks, and other features. Additionally, the processing can include isolating landmarks, indicia, geographic features, and other optical data for further identification in subsequent or parallel machine vision processing. For example, and as illustrated in FIG. 3B, the sensor data 310 can be processed to create corrected sensor data 320. The machine vision controller 402 may process the sensor data 310 to correct apparent distortion, misalignments, and other issues using the performance data and any other available data. - The
method 500 further includes identifying a geographic indicator in the processed sensor data, at block 508. For example, the machine vision controller 402 can interpret the processed sensor data to identify visual indicators such as those illustrated in FIG. 1. Other geographic indicators can include, for example, the number of buildings surrounding a landing area, the height of buildings surrounding a landing area, surrounding higher elevation (such as mountains or other natural features), surrounding lower elevation (such as crevasses or other natural features), and other geographic indicators. Still further, geographic indicators can include GPS data, rivers, bodies of water, physical or virtual beacons, temporary structures such as towers or cones, and other indicators. - The
method 500 further includes determining a geographic location of the aerial vehicle based on the identified geographic indicator, at block 510. The machine vision controller 402 can compare the identified geographic indicators to a predetermined set of geographic indicators, such as expected geographic features or a map or model of expected features, to determine a location of the aerial vehicle 200. For example, the machine vision controller 402 can determine whether an expected control tower at an airport is detectable in the sensor data. The machine vision controller 402 can also determine whether airport buildings are detectable in the sensor data. Thus, using all available sensor data, the machine vision controller 402 can compare the sensor data to expected data from a flight plan to determine the geographic location. - In some example aspects, a model of expected features may include depth or height information. Accordingly, in some examples the
machine vision controller 402 is configured to determine a height of geographic features in a location proximal to the aerial vehicle based on processing the sensor data. For example, the machine vision controller 402 can access one or more images obtained by one or more image sensors to identify one or more geographic features. A depth camera, radar/LIDAR, and/or other sensors may be used to determine three-dimensional or height information associated with the geographic features. The machine vision controller 402 can access a database of predetermined geographic indicators associated with a geographic location, such as known landing locations. The database can include height or other depth information associated with the predetermined geographic indicators. The machine vision controller 402 can consult the database, including the heights of geographic indicators, in order to identify geographic features in the image data. The machine vision controller 402 can determine the geographic location corresponding to a geographic feature by comparing the height of geographic features from the sensor data to the height information in the database. - Utilizing the determined geographic location, a pilot or operator of the
aerial vehicle 200 can decide whether to safely land the aerial vehicle, alter a flight plan, or otherwise control the aerial vehicle. It is contemplated that visual indicators such as warnings and audio indicators such as alarms can also be used. For example, an indicator, such as a visual indicator, can be provided showing whether landing can be performed safely. In some implementations, the indicator may identify a runway or other feature with which the aerial vehicle is aligned for landing. In some implementations, an indicator may include a warning that the aerial vehicle is misaligned. For example, the system may compare the geographic indicator with flight plan data and provide a warning if the geographic indicator is not associated with the flight plan data. In some examples, a warning can be provided if the geographic indicators include obstructions identified in machine vision processing. Furthermore, visual and audio indicators can include positive indications that a landing pattern is deemed safe by the machine vision controller 402 so that the pilot or operator can concentrate on other maneuvers to complete a safe landing. - Additionally, the pilot or operator of the
aerial vehicle 200 may be remote, and the aerial vehicle may be an unmanned aerial vehicle, such that alerts, indicators, and geographic location are provided to the pilot or operator at a remote location. In these circumstances, the identified geographic location may be used to automatically maneuver the aircraft, as described below with reference to FIG. 6. It is noted, however, that a manned aircraft may also be automatically maneuvered in accordance with embodiments of the disclosed technology. FIG. 6 is a flow diagram of an additional method 600 of machine vision processing of sensor data of an aerial vehicle, according to example embodiments of the present disclosure. - As shown in
FIG. 6, the method 600 includes obtaining sensor data from a sensor array, at block 602. The sensor data may be substantially similar to the sensor data described above. The method 600 further includes processing the obtained sensor data to identify geographic features in the area proximal to the aerial vehicle, at block 604. The processing may include correcting distortion, compensating for movement, identifying obstacles, and/or matching flight plans to expected geographic features. - According to at least one example, the processing can include processing a distorted image, or distorted sensor data, using machine vision to create a processed image or processed sensor data with reduced, minimized, or corrected distortion. The processed image may include legible indicia and other corrections. The processing can also include compensating for movement, for example, by correcting perspectives of buildings, landmarks, and other features. Additionally, the processing can include isolating landmarks, indicia, geographic features, and other optical data for further identification in subsequent or parallel machine vision processing.
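The movement compensation described for block 604, and for the translation of sensor data 310 into sensor data 320 in FIG. 3B, can be sketched in heavily simplified planar form as rotating sensor-frame coordinates back through the estimated crab angle so that features line up with the approach frame. The disclosure does not give a specific transform; this rotation, its sign convention, and the values used are illustrative assumptions.

```python
import math

def compensate_crab(points, crab_angle_deg):
    """Rotate sensor-frame (x, y) points counterclockwise by the crab
    angle so features align with the approach frame (sign convention
    assumed: positive angle = nose crabbed clockwise of track)."""
    a = math.radians(crab_angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    return [(x * cos_a - y * sin_a, x * sin_a + y * cos_a) for x, y in points]

# A marker dead ahead on the approach appears 10 degrees off-axis in
# the sensor frame when the vehicle crabs 10 degrees into a crosswind;
# rotating by the crab angle restores it to straight ahead, (0, 1).
sensor_points = [(math.sin(math.radians(10)), math.cos(math.radians(10)))]
corrected = compensate_crab(sensor_points, 10.0)
```

A full implementation would work on image pixels with a perspective warp rather than point lists, but the geometric idea of undoing the crab-induced offset is the same.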
- In some examples, processing the sensor data includes processing image data based on aircraft misalignment. For example, performance data such as crosswind and other information can be used to determine a misalignment between the aircraft approach vector and the image data obtained from one or more sensors. The performance data can be used to translate or otherwise determine modified sensor data that corrects or compensates for the misalignment. For example, when approaching a landing area, the image data may identify a first runway, according to a first runway marker, for example. However, the performance data may indicate that the aerial vehicle is landing in a strong crosswind. This and/or other performance data may indicate that the aerial vehicle is misaligned with its approach vector. This information may be used to modify the sensor data to indicate the appropriate information corresponding to the approach. As a result, the
machine vision controller 402 may determine that the aircraft is actually approaching a second runway adjacent to the first runway. The determination may be made even though the image data corresponds to the first runway. - The
method 600 further includes identifying a geographic indicator in the processed sensor data, at block 606. The geographic indicator may include a desired indicator or an obstacle. For example, the geographic indicator may include a landmark, body of water, airport, particular runway at an airport, or other geographic indicator, including any of the geographic indicators described herein. Using machine vision processing and/or a machine vision algorithm, the geographic indicator can be identified in the processed sensor data. Additionally, the machine vision controller can identify obstacles at block 606. For example, an obstacle can include any object or indicia that is not expected in a landing area. Obstacles can include temporary or permanent obstacles. Obstacles can further include indicia purposefully marked (such as an 'X' or 'NOT SAFE FOR LANDING') to allow pilots and operators to identify that a landing area is not to be used for landings. In some examples, the machine vision controller 402 can immediately indicate the presence of an obstacle. In some examples, the machine vision controller 402 can delay an alert if an obstacle is a moving obstacle, such as a landing assist vehicle, that will be clear of an active landing area prior to landing by the aerial vehicle. Furthermore, the machine vision controller 402 can issue multiple forms of soft alerts and high alerts if moving obstacles are deemed hazardous, are not moving as expected, or are not following an expected pattern. Processing and identifying obstacles can be aided through the use of radar/LIDAR information and other suitable data available from the sensors. - The
method 600 further includes determining whether the identified geographic indicator is in a flight plan associated with the aerial vehicle, at block 608. For example, the machine vision controller 402 may process the sensor data and compare it to expected sensor data of the flight plan. The expected sensor data may include maps, models, renderings, and/or other suitable data for machine vision processing and comparison to the processed sensor data. The expected sensor data can include, for example, a two-dimensional map having associated height or dimensional information. Machine vision processing can facilitate comparisons between the processed sensor data and this expected sensor data to determine whether a match exists or the geographic indicator exists within the flight plan. - The
method 600 further includes automatically maneuvering the aerial vehicle, or providing a visual indication, based on the determination, at block 610. For example, based on the determination, the aerial vehicle may be automatically maneuvered to avoid a landing or continue flying. Other automatic maneuvers, including obstacle avoidance, fast touchdown and takeoff, and similar maneuvers based on geographic indicators, are also applicable. The machine vision controller 402 can also provide video or audio indications, as described above. - It should be appreciated that the operational blocks of
method 500 and method 600 may not exhaustively describe all aspects of aerial vehicle control. These operational blocks may be a simplified operational flow chart describing only partial aspects of aerial vehicle control and should not be construed as illustrating all possible control scenarios. -
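The flow of blocks 606 through 610 can be sketched end to end: match a sensed feature (with its measured height) against a database of predetermined geographic indicators, then check the match against the flight plan to choose between indicating a safe landing and an automatic go-around. The database contents, the height tolerance, and the return values below are illustrative assumptions, not details from the disclosure.

```python
def match_indicator(sensed_xy_h, database, height_tol_m=10.0):
    """Return the name of the nearest database indicator whose stored
    height agrees with the sensed height within a tolerance, else None."""
    x, y, h = sensed_xy_h
    candidates = [
        ((dx - x) ** 2 + (dy - y) ** 2, name)
        for name, (dx, dy, dh) in database.items()
        if abs(dh - h) <= height_tol_m
    ]
    return min(candidates)[1] if candidates else None

def landing_decision(indicator, flight_plan_indicators):
    """Blocks 608/610 sketched: proceed only if the identified indicator
    belongs to the flight plan; otherwise command a go-around."""
    return "indicate_safe" if indicator in flight_plan_indicators else "go_around"

# Hypothetical database: (x, y) position in meters and height in meters.
db = {"control_tower": (0.0, 0.0, 40.0), "hangar": (500.0, 100.0, 15.0)}
plan = {"control_tower", "runway_09"}
found = match_indicator((10.0, -5.0, 42.0), db)   # height matches the tower
decision = landing_decision(found, plan)
```

Filtering candidates by height before the nearest-neighbor step mirrors the description above, where depth or height information disambiguates features that look similar in a two-dimensional image.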
FIG. 7 depicts a block diagram of an example computing system 1000 that can be used to implement methods and systems according to example embodiments of the present disclosure. Computing system 1000 may be used to implement a machine vision controller 402 as described herein. It will be appreciated, however, that computing system 1000 is one example of a suitable computing system for implementing the machine vision controller 402 and other computing elements described herein. - As shown, the
computing system 1000 can include one or more computing device(s) 1002. The one or more computing device(s) 1002 can include one or more processor(s) 1004 and one or more memory device(s) 1006. The one or more processor(s) 1004 can include any suitable processing device, such as a microprocessor, microcontroller, integrated circuit, logic device, or other suitable processing device. The one or more memory device(s) 1006 can include one or more computer-readable media, including, but not limited to, non-transitory computer-readable media, RAM, ROM, hard drives, flash drives, or other memory devices. - The one or more memory device(s) 1006 can store information accessible by the one or more processor(s) 1004, including computer-
readable instructions 1008 that can be executed by the one or more processor(s) 1004. The instructions 1008 can be any set of instructions that, when executed by the one or more processor(s) 1004, cause the one or more processor(s) 1004 to perform operations. The instructions 1008 can be software written in any suitable programming language or can be implemented in hardware. In some embodiments, the instructions 1008 can be executed by the one or more processor(s) 1004 to cause the one or more processor(s) 1004 to perform operations, such as the operations for machine vision processing of sensor data, identifying geographic features and indicators, locating an aerial vehicle based on sensor data, and other operations such as those described above with reference to methods 500 and 600. - The memory device(s) 1006 can further store
data 1010 that can be accessed by the processors 1004. For example, the data 1010 can include sensor data such as engine parameters, model data, logic data, etc., as described herein. The data 1010 can include one or more table(s), function(s), algorithm(s), model(s), equation(s), etc., according to example embodiments of the present disclosure. - The one or more computing device(s) 1002 can also include a
communication interface 1012 used to communicate, for example, with the other components of the system. The communication interface 1012 can include any suitable components for interfacing with one or more network(s), including, for example, transmitters, receivers, ports, controllers, antennas, or other suitable components. - The technology discussed herein makes reference to computer-based systems and actions taken by, and information sent to and from, computer-based systems. One of ordinary skill in the art will recognize that the inherent flexibility of computer-based systems allows for a great variety of possible configurations, combinations, and divisions of tasks and functionality between and among components. For instance, processes discussed herein can be implemented using a single computing device or multiple computing devices working in combination. Databases, memory, instructions, and applications can be implemented on a single system or distributed across multiple systems. Distributed components can operate sequentially or in parallel.
- Although specific features of various embodiments may be shown in some drawings and not in others, this is for convenience only. In accordance with the principles of the present disclosure, any feature of a drawing may be referenced and/or claimed in combination with any feature of any other drawing.
- This written description uses examples to disclose the claimed subject matter, including the best mode, and also to enable any person skilled in the art to practice the claimed subject matter, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the disclosed technology is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they include structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/939,521 US20230002048A1 (en) | 2018-01-29 | 2022-09-07 | Aerial vehicles with machine vision |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1801416.7A GB2570497B (en) | 2018-01-29 | 2018-01-29 | Aerial vehicles with machine vision |
GB1801416 | 2018-01-29 | ||
GB1801416.7 | 2018-01-29 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/939,521 Division US20230002048A1 (en) | 2018-01-29 | 2022-09-07 | Aerial vehicles with machine vision |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190233101A1 (en) | 2019-08-01
US11440657B2 US11440657B2 (en) | 2022-09-13 |
Family
ID=61558237
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/240,913 Active 2040-06-08 US11440657B2 (en) | 2018-01-29 | 2019-01-07 | Aerial vehicles with machine vision |
US17/939,521 Pending US20230002048A1 (en) | 2018-01-29 | 2022-09-07 | Aerial vehicles with machine vision |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/939,521 Pending US20230002048A1 (en) | 2018-01-29 | 2022-09-07 | Aerial vehicles with machine vision |
Country Status (3)
Country | Link |
---|---|
US (2) | US11440657B2 (en) |
FR (1) | FR3077393B1 (en) |
GB (1) | GB2570497B (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11355022B2 (en) * | 2019-09-13 | 2022-06-07 | Honeywell International Inc. | Systems and methods for computing flight controls for vehicle landing |
US11900823B2 (en) | 2019-09-13 | 2024-02-13 | Honeywell International Inc. | Systems and methods for computing flight controls for vehicle landing |
US20220097713A1 (en) * | 2020-09-28 | 2022-03-31 | Ford Global Technologies, Llc | Crosswind risk determination |
US11400940B2 (en) * | 2020-09-28 | 2022-08-02 | Ford Global Technologies, Llc | Crosswind risk determination |
US20230001922A1 (en) * | 2021-07-01 | 2023-01-05 | Triple Win Technology(Shenzhen) Co.Ltd. | System providing blind spot safety warning to driver, method, and vehicle with system |
US11392118B1 (en) * | 2021-07-23 | 2022-07-19 | Beta Air, Llc | System for monitoring the landing zone of an electric vertical takeoff and landing aircraft |
US11760348B2 (en) | 2021-09-27 | 2023-09-19 | Ford Global Technologies, Llc | Vehicle boundary control |
Also Published As
Publication number | Publication date |
---|---|
FR3077393B1 (en) | 2022-06-24 |
GB2570497B (en) | 2020-07-29 |
GB201801416D0 (en) | 2018-03-14 |
US11440657B2 (en) | 2022-09-13 |
US20230002048A1 (en) | 2023-01-05 |
FR3077393A1 (en) | 2019-08-02 |
GB2570497A (en) | 2019-07-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11440657B2 (en) | Aerial vehicles with machine vision | |
US9851724B2 (en) | Automatic take-off and landing control device | |
US9092975B2 (en) | Aircraft systems and methods for displaying visual segment information | |
US10013885B2 (en) | Airspace deconfliction system and method | |
US9478140B2 (en) | System and method for displaying traffic and associated alerts on a three-dimensional airport moving map display | |
US7269513B2 (en) | Ground-based sense-and-avoid display system (SAVDS) for unmanned aerial vehicles | |
EP3125213B1 (en) | Onboard aircraft systems and methods to identify moving landing platforms | |
EP2837565B1 (en) | Aircraft systems and methods for displaying runway lighting information | |
US7937191B2 (en) | Termination secured route planning | |
EP3211376B1 (en) | System and method for detecting misaligned stationary objects | |
US11915603B2 (en) | Docking guidance display methods and systems | |
US9558674B2 (en) | Aircraft systems and methods to display enhanced runway lighting | |
EP2919219B1 (en) | System and method for identifying runway position during an intersection takeoff | |
US20160307450A1 (en) | Aircraft systems and methods to display moving landing platforms | |
CN105321377A (en) | System and method for automatically identifying displayed ATC mentioned traffic | |
CN104807456A (en) | Method for automatic return flight without GPS (global positioning system) signal | |
US11600185B2 (en) | Systems and methods for flight planning for conducting surveys by autonomous aerial vehicles | |
CN114379802A (en) | Automatic safe landing place selection for unmanned flight system | |
US20220157182A1 (en) | System and Method For Flight and Landing Navigation for Unpiloted Vertical and Take-Off Landing (UVTOL) Aircraft | |
EP3702871B1 (en) | Design and processing of multispectral sensors for autonomous flight | |
US20230023069A1 (en) | Vision-based landing system | |
EP3926608A2 (en) | Docking guidance display methods and systems | |
Kartikeyan et al. | A Vision-Based Parameter Estimation for an Aircraft in Approach Phase |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: GE AVIATION SYSTEMS LIMITED, GREAT BRITAIN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SCHWINDT, STEFAN ALEXANDER;REEL/FRAME:047915/0871. Effective date: 20180105 |
| FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | PATENTED CASE |