EP3866139A1 - Collision awareness using historical data for vehicles


Info

Publication number
EP3866139A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
predicted
processing circuitry
predicted path
data
Prior art date
Legal status
Pending
Application number
EP21154155.2A
Other languages
German (de)
French (fr)
Inventor
Vipul Gupta
Lawrence J. Surace
Sujaya Rao
Rajesh Chenchu
Siva Prasad Kolli
Vinod Prakash Singh
Current Assignee
Honeywell International Inc
Original Assignee
Honeywell International Inc
Priority date
Filing date
Publication date
Priority claimed from US 17/070,830 (US11854418B2)
Application filed by Honeywell International Inc
Publication of EP3866139A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0017 Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G 5/0021 Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
    • G08G 5/0073 Surveillance aids
    • G08G 5/0004 Transmission of traffic-related information to or from an aircraft
    • G08G 5/0013 Transmission of traffic-related information to or from an aircraft with a ground station
    • G08G 5/04 Anti-collision systems
    • G08G 5/045 Navigation or guidance aids, e.g. determination of anti-collision manoeuvers
    • G08G 5/06 Traffic control systems for aircraft, e.g. air-traffic control [ATC] for control when on the ground
    • G08G 5/065 Navigation or guidance aids, e.g. for taxiing or rolling

Definitions

  • This disclosure relates to collision awareness for vehicles.
  • Collision avoidance systems have been implemented that help prevent potential collisions of an aircraft with another aircraft using various sensors, imaging devices, radar, and other hardware components installed on the aircraft.
  • Such hardware components increase the weight, maintenance complexity, and overall cost of such vehicles.
  • An increasing amount of air traffic has also involved very large airliners with very long wingspans, which may sometimes reduce wingtip clearance margins while the aircraft is in motion on airport ground surfaces.
  • Multiple aircraft in an area may be powered down at any given time, or may otherwise not be emitting a tracking beacon that could be used to reduce the likelihood of collision with another vehicle.
  • An aircraft may be powered down and pulled by a tug vehicle, in which case ground collisions between aircraft, or between aircraft and other vehicles, may be even more likely to occur.
  • Wingtip collisions may occur at an even higher rate due to the seemingly unrestricted route an aircraft or tug vehicle can take to reach an intended destination.
  • a vehicle may transmit a current location of the vehicle to a user interface (UI) device (e.g., an electronic flight bag (EFB)) or to a remote data server (e.g., a cloud-based data server).
  • the remote data server or EFB may predict the potential collision zone using one or more of the historical navigation route data, clearance information for one or more vehicles, and/or aerodrome guidance features and provide an indication of the potential collision zone to a user.
  • the historical navigation route data may be based on transponder positional data and stored in a database of historical vehicle data.
  • the aerodrome guidance features may include data stored in a database that provides information as to the location of guidance markings, such as guidance lines painted on a surface, guidance signs, building features, and other information that provide guidance to vehicles throughout a particular aerodrome location.
  • a collision awareness system may predict routes using the historical navigation route data and aerodrome guidance features and predict vehicle positions along the route to determine potential collision zones.
  • the collision awareness system may provide the potential collision zone data for display on an EFB, such as on an airport moving map display (AMMD) application executing on the EFB.
  • Surface vehicle tracking systems may be used to determine airport surface transient object data using, for example, multilateration sensors or other airport system sensors. This data may be used to confirm or verify the potential collision zones that the collision awareness system predicts using one or more of the historical navigation route data, clearance information for one or more vehicles, and/or aerodrome guidance features.
  • Some transient ground objects, or types of transient ground objects, may not be actively transmitting messages or signals that can be received by certain types of multilateration sensors or other airport system sensors, or may not respond to certain types of interrogation signals transmitted by such sensors, such as when the airport system sensors use cooperative surveillance with which objects other than aircraft are not typically configured to cooperate.
  • a transient aircraft may be pulled via an aircraft tug (e.g., a tug vehicle that transports other vehicles).
  • the aircraft being pulled may be powered down at the time, such that the aircraft does not transmit signals that may be used to track the vehicle location.
  • the vehicles may be in areas of an aerodrome that provide less guidance to vehicles via aerodrome guidance features.
  • an apron area of an airport may not include painted guidance features on the surface that may be referenced in an aerodrome guidance database.
  • Complex maneuvering and high-traffic areas in various aerodrome locations increase the likelihood of potential vehicle collisions (e.g., wingtip collisions, etc.).
  • a collision awareness system may utilize one or more of the historical navigation route data and/or aerodrome guidance features to predict potential collision zones between vehicles.
  • the collision awareness system may utilize vehicle clearance information, such as clearance information from an air traffic controller (ATC), to predict potential collision zones between vehicles traversing a surface, where at least one vehicle is moving, either by tug or not.
  • The collision awareness system may execute on a remote server that collects data (such as vehicle positions), updates databases, and predicts collision zones.
  • the collision awareness system may execute at least in part on an EFB or other user interface device.
  • The collision awareness system may receive clearance information in the form of text or voice, process the clearance information, and determine navigational information for a vehicle or predict, based on the clearance information, a current position of the vehicle. For example, if a vehicle receives clearance to a particular gate of an apron area but then powers down its avionics system, the collision awareness system may determine how much time has passed since the vehicle received the clearance, determine how long a vehicle has historically taken to arrive at the destination point (or another target mark on a path toward it), and predict the vehicle's location at any particular point in time.
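The elapsed-time position prediction described in this example can be sketched as follows. This is a minimal illustration, not the patent's implementation: the local metres-based waypoint frame, the function names, and the assumption that the vehicle progresses at the historical average pace for its route are all hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Waypoint:
    x: float  # metres east of a reference point (hypothetical local frame)
    y: float  # metres north of the reference point


def predict_position(route: list,
                     historical_traversal_s: float,
                     elapsed_s: float) -> Waypoint:
    """Estimate where a powered-down vehicle is along its cleared route.

    Assumes the vehicle moves at the historical average pace for this
    route, so fraction travelled = elapsed time / historical time.
    """
    frac = min(max(elapsed_s / historical_traversal_s, 0.0), 1.0)
    # Total route length from per-segment lengths.
    seg_lens = [((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5
                for a, b in zip(route, route[1:])]
    target = frac * sum(seg_lens)
    # Walk segments until the target distance falls inside one, then
    # linearly interpolate within that segment.
    for (a, b), length in zip(zip(route, route[1:]), seg_lens):
        if target <= length and length > 0:
            t = target / length
            return Waypoint(a.x + t * (b.x - a.x), a.y + t * (b.y - a.y))
        target -= length
    return route[-1]  # past the end (or rounding): clamp to destination
```

The clamp on the travelled fraction keeps the estimate on the route even when more time has elapsed than history suggests, matching the idea that the vehicle has at most reached its cleared destination.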
  • The collision awareness system may use the historical navigation route data and the aerodrome guidance features to predict a location of the vehicle, predict a trajectory of the vehicle, and predict trajectories of other vehicles, to determine whether an overlap between the envelopes of two or more vehicles indicates a potential collision at a prospective or future time.
  • a collision awareness system may be implemented without requiring additional hardware installation on an aircraft and may provide reliable indications as to predicted collision zones in an aerodrome by leveraging particular computing systems to overlay data, such as aerodrome surface data overlaid with historical navigation route data.
  • The collision awareness system may leverage machine learning models, trained on particular data inputs, to provide such predictions, allowing continuous modeling and updating of predicted routes as a vehicle traverses the route.
  • A collision awareness system may predict a route of a first vehicle but, as the first vehicle starts traveling the predicted route, determine an updated predicted route based on data received from the vehicle (e.g., speed information, position information, etc.). This allows the collision awareness system to provide dynamic predictions on the fly as objects move throughout the aerodrome and as historical navigation route data evolves with ever-changing conditions.
  • the collision awareness system may predict collision zones based on aircraft specifics and aerodrome specifics while referencing both general and specific information derived from multiple vehicle types and aerodrome locations.
  • a method includes obtaining, by processing circuitry of a ground collision awareness system, historical navigation route data for one or more reference vehicles, the historical navigation route data being based on transponder positional data.
  • the method further includes identifying, by the processing circuitry, a plurality of aerodrome guidance features for a particular aerodrome location, the aerodrome guidance features including guidance marker information.
  • the method further includes determining, by the processing circuitry, a predicted path of a first vehicle, the predicted path comprising a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information, and the second portion of the predicted path being predicted using the historical navigation route data.
  • the method further includes determining, by the processing circuitry, a predicted position of the first vehicle along the predicted path at a prospective time.
  • the method further includes determining, by the processing circuitry, a predicted position of a second vehicle with respect to approximately the same prospective time.
  • the method further includes performing, by the processing circuitry, a comparison of a first vehicle envelope for the first vehicle and a second vehicle envelope for the second vehicle at the predicted positions.
  • the method further includes identifying, by the processing circuitry, an overlap of the first vehicle envelope and the second vehicle envelope.
  • The method further includes determining, by the processing circuitry, a predicted collision zone of the first vehicle and the second vehicle at the prospective time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
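The envelope comparison in the steps above can be illustrated with axis-aligned bounding boxes. Real vehicle envelopes would likely be richer shapes (wingspan polygons with safety margins), so this is a hedged sketch in which all names are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Envelope:
    """Axis-aligned safe-zone box around a vehicle's predicted position.

    Half-extents stand in for wingspan/length margins (an assumption;
    the patent does not prescribe an envelope shape).
    """
    cx: float      # envelope centre, x
    cy: float      # envelope centre, y
    half_w: float  # half-width
    half_h: float  # half-height


def predicted_collision_zone(a: Envelope, b: Envelope) -> Optional[tuple]:
    """Return the overlap rectangle (x_min, y_min, x_max, y_max) of two
    vehicle envelopes at the same prospective time, or None if the
    envelopes do not intersect (no predicted collision zone)."""
    x_min = max(a.cx - a.half_w, b.cx - b.half_w)
    x_max = min(a.cx + a.half_w, b.cx + b.half_w)
    y_min = max(a.cy - a.half_h, b.cy - b.half_h)
    y_max = min(a.cy + a.half_h, b.cy + b.half_h)
    if x_min < x_max and y_min < y_max:
        return (x_min, y_min, x_max, y_max)
    return None
```

The returned rectangle is what a display layer could highlight as the predicted collision zone for the prospective time being evaluated.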
  • This disclosure also describes a ground collision awareness system comprising a processor and a memory.
  • the memory is configured to store: historical navigation route data for one or more reference vehicles, wherein the historical navigation route data are based on transponder positional data, and a plurality of aerodrome guidance features for one or more aerodrome locations, wherein the aerodrome guidance features include guidance marker information.
  • the processor of the ground collision awareness system is configured to determine a predicted path of a first vehicle, the predicted path comprising a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information, and the second portion of the predicted path being predicted using the historical navigation route data; determine a predicted position of the first vehicle along the predicted path at a prospective time; determine a predicted position of a second vehicle with respect to approximately the same prospective time; perform a comparison of a first vehicle envelope for the first vehicle and a second vehicle envelope for the second vehicle at the predicted positions; identify an overlap of the first vehicle envelope and the second vehicle envelope; and determine a predicted collision zone of the first vehicle and the second vehicle at the prospective time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
  • This disclosure also describes a non-transitory computer-readable storage medium having stored thereon instructions.
  • The instructions, when executed, cause one or more processors to: obtain historical navigation route data for one or more reference vehicles, the historical navigation route data being based on transponder positional data; identify a plurality of aerodrome guidance features for a particular aerodrome location, the aerodrome guidance features including guidance marker information; determine a predicted path of a first vehicle, the predicted path comprising a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information, and the second portion of the predicted path being predicted using the historical navigation route data; determine a predicted position of the first vehicle along the predicted path at a prospective time; determine a predicted position of a second vehicle with respect to approximately the same prospective time; perform a comparison of a first vehicle envelope for the first vehicle and a second vehicle envelope for the second vehicle at the predicted positions; identify an overlap of the first vehicle envelope and the second vehicle envelope; and determine a predicted collision zone of the first vehicle and the second vehicle at the prospective time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
  • the disclosure is also directed to an article of manufacture comprising a computer-readable storage medium.
  • the computer-readable storage medium comprises computer-readable instructions that are executable by a processor.
  • the instructions cause the processor to perform any part of the techniques described herein.
  • the instructions may be, for example, software instructions, such as those used to define a software or computer program.
  • the computer-readable medium may be a computer-readable storage medium such as a storage device (e.g., a disk drive, or an optical drive), memory (e.g., a Flash memory, read only memory (ROM), or random access memory (RAM)) or any other type of volatile or non-volatile memory or storage element that stores instructions (e.g., in the form of a computer program or other executable) to cause a processor to perform the techniques described herein.
  • the computer-readable medium may be a non-transitory storage medium.
  • the collision awareness system may provide such data over a wireless network to an application (e.g., an airport moving map display (AMMD) application) onboard an aircraft.
  • the collision awareness system may provide such indications of potential collision zones on an electronic flight bag (EFB), which may be implemented on a tablet computer or analogous user interface device.
  • the flight crew may view and use the AMMD enhanced with the information from the collision awareness system while the pilot controls the aircraft on the airport ground surface, for example, during taxiing, parking, etc.
  • a tug operator may view and use the AMMD enhanced with the information from the collision awareness system while tugging an aircraft to a destination location according to ATC clearance information.
  • the collision awareness system may determine potential collision zones with one or more other surface vehicles (e.g., other aircraft or ground vehicles) and transmit warnings of the potential collision zones to the EFB. Implementations of this disclosure may thus provide better situational awareness for controlling ground movement of aircraft on airport taxiways, including in weather conditions of limited visibility, without the need for any new hardware to be installed in the aircraft itself (and thus, without the need for new hardware to be certified by relevant aviation authorities), and without requiring cooperative participation by other aircraft. Implementations of this disclosure may not only decrease the possibility of collision of an aircraft with another aircraft or surface vehicle, but may also provide additional benefits for the airport, such as smoother taxiing and fewer interruptions or delays due to confusion or lack of situational awareness in the ground traffic.
  • FIG. 1 is a conceptual block diagram depicting example components of a collision awareness system environment 102.
  • a collision awareness system may operate in such an example collision awareness system environment 102, including various example components of FIG. 1 .
  • The collision awareness system environment 102 includes various components, including surface and/or flight vehicles 111, a traffic controller 114, one or more data server(s) 132, various databases or datastores 105, and user interface devices 104.
  • a collision awareness system may be implemented as software installed on one or more of the components of collision awareness system environment 102.
  • Although vehicles 112A-N may be referred to at times as being airplanes of various configurations, the techniques of this disclosure are not so limited, and vehicles 112A-N may include other vehicles, such as helicopters, hybrid tilt-rotor aircraft, urban air vehicles, jets, quadcopters, hovercraft, space shuttles, uncrewed aerial vehicles (UAVs), flying robots, etc.
  • Although vehicles 113A-N may be referred to at times as being tug vehicles, the techniques of this disclosure are not so limited, and vehicles 113A-N may include other vehicles, such as unmanned ground vehicles, transient ground surface vehicles, unmanned tug vehicles (e.g., remote-control vehicles), luggage cart vehicles having multiple cars attached via linkages, refueler trucks, airport buses, container loaders, belt loaders, catering vehicles, emergency vehicles, snow removal vehicles, ground maintenance equipment, etc.
  • vehicles 111 may receive direct communications from traffic controller 114, such as via radio or cellular communication.
  • traffic controller 114 may transmit clearance information directly to one of aircraft 112 or to a tug vehicle 113, indicating a destination port for parking an aircraft.
  • user interface devices 104A-104N may include a wide variety of user interface devices.
  • user interface devices 104 may include tablet computers, laptop computers, phones, EFBs, augmented reality headsets or virtual reality headsets, or other types of user interface devices.
  • User interface devices 104 may be configured to receive surface vehicle movement data with indications of potential collision zones from a collision awareness system.
  • User interface device 104 may also be configured to generate (e.g., render) and present an AMMD that shows transient surface vehicles and indications of potential collision zones, in accordance with illustrative aspects of this disclosure, such as those of FIGS. 5A-5C .
  • Network 130 may include any number of different types of network connections, including satellite connections and Wi-Fi™ connections.
  • Network 130 may include networks established using geosynchronous satellites 105A, low-earth orbit satellites 105B, global navigation satellite systems 105C, cellular base station transceivers 160 (e.g., for 3G, 4G, LTE, and/or 5G cellular network access), and/or Wi-Fi™ access points.
  • the geosynchronous satellites 105A and low-earth orbit satellites 105B can communicate with gateways that provide access to network 130 for one or more devices implementing the collision awareness system.
  • Cellular base station transceivers can have connections that provide access to network 130.
  • network 130 may include a wired system.
  • network 130 may include an ethernet system, such as a redundant ethernet system shown in FIG. 7 of this disclosure.
  • network 130 may include a multilateration system local area network (LAN), such as the multilateration system LAN shown in FIG. 7 of this disclosure.
  • any one of devices of collision awareness system environment 102 executing one or more techniques of a collision awareness system may be configured to communicate with any one of the various components via network 130.
  • a single component of collision awareness system environment 102 may be configured to execute all techniques of the collision awareness system.
  • collision awareness system may include a system that resides on vehicles 111, data server(s) 132, traffic controller 114, or user interface devices 104A/104N.
  • collision awareness system may operate as part of a software package installed on one or more computing devices.
  • traffic controller 114 may operate software that executes one or more of the various techniques of the disclosed collision awareness system.
  • a software version of collision awareness system may be installed on a computing device of traffic controller 114.
  • the disclosed collision awareness system may be included with user interface devices 104 or one or more data server(s) 132.
  • data server(s) 132 may include a cloud-based data server that implements the disclosed collision awareness system.
  • one or more data server(s) 132 may be configured to receive input data from network 130 (e.g., vehicle positional data, aerodrome guidance features, clearance information, etc.), determine a predicted collision zone, in accordance with one or more techniques of this disclosure, and may output predicted collision zone data to one or more components of FIG. 1 , such as user interface devices 104, traffic controller 114, or vehicles 111.
  • data server(s) 132 may include datastores 105.
  • some or all of datastores 105 may be embodied as separate devices that interface with other components of collision awareness system environment 102 directly or via network 130. For example, where collision awareness system is implemented at least in part on one or more of data server(s) 132, datastores 105 may interface with data server(s) 132 directly or via network 130.
  • the databases may include historical vehicle data 106, aerodrome guidance data 108, and in some instances, clearance data 110. Although shown as being a single datastore 105, the databases shown as part of datastore 105 may be embodied as separate objects.
  • a database included with a vehicle or external to the vehicle may be or include a key-value data store, such as an object-based database or dictionary.
  • A database may include any data structure (and/or combinations of multiple data structures) for storing and/or organizing data, including, but not limited to, relational databases (e.g., Oracle databases, MySQL databases, etc.), non-relational databases (e.g., NoSQL databases, etc.), in-memory databases, spreadsheets, comma-separated values (CSV) files, extensible markup language (XML) files, text (TXT) files, flat files, and/or any other widely used or proprietary format for data storage.
  • Databases are typically stored in one or more data stores. Accordingly, each database referred to herein (e.g., in the description herein and/or the figures of the present application) is to be understood as being stored in one or more data stores.
  • outgoing requests and/or incoming responses may be communicated in any suitable formats.
  • XML, JSON, and/or any other suitable formats may be used for API requests and responses or otherwise.
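As one illustration of the JSON option, a position report exchanged between these components might be serialized as below. The field names and values are purely hypothetical; the patent does not define a payload schema.

```python
import json

# Hypothetical position-report payload; every field name here is
# illustrative, not taken from the patent.
report = {
    "vehicle_id": "N123AB",
    "aerodrome": "KPHX",
    "timestamp": "2021-02-01T14:30:00Z",
    "position": {"lat": 33.4343, "lon": -112.0116},
    "ground_speed_kt": 12.5,
}

payload = json.dumps(report)    # what a vehicle or EFB would transmit
decoded = json.loads(payload)   # what a data server would parse
```

XML would carry the same information; JSON is shown only because it round-trips in a few lines.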
  • data transfer refers to both transmitting data from one of vehicles 111, traffic controller 114, data server(s) 132, or user interface devices 104 over network 130 and receiving data at user interface devices 104, data server(s) 132, traffic controller 114, or vehicles 111, over network 130.
  • the data transfer may follow various formats, such as a database format, files, XML, HTML, RDF, JSON, a file format that is proprietary to the system, data object format, or any other format, and may be encrypted or have data of any available type.
  • historical vehicle data 106 may store historical navigation route data, vehicle data (e.g., maintenance logs, safe zone envelope data, etc.).
  • An example visual depiction of certain historical navigation route data may be as shown in Table 1 below.
  • the above Table 1 may include data with respect to a particular one of vehicles 111, such as vehicle 112A or vehicle 113A.
  • the historical navigation route data may be with respect to a particular location, such as a particular aerodrome location.
  • historical navigation route data may include additional data entries for 'vehicle IDs' and 'airport identifiers.'
  • the above table is merely one example representation of certain historical navigation route data that database 105 may manage and store over time.
  • the navigation route data may be based on data received directly from each aircraft, such as from transponder data, or may include tracking data obtained otherwise, such as through external sensors.
  • Data entries related to 'speed' as shown may refer to ground speed.
  • Historical vehicle data 106 may store speed in any suitable unit, such as nautical miles per hour, meters per second, etc. Historical vehicle data 106 may also store acceleration data as determined from the velocity data or as received directly from one of vehicles 111 or external sensors.
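Table 1 itself is not reproduced in this excerpt, but a historical route sample consistent with the fields the surrounding text mentions (vehicle IDs, airport identifiers, position, speed in configurable units, and acceleration derived from velocity data) might be sketched as follows. All field names are illustrative assumptions:

```python
from dataclasses import dataclass

KNOTS_PER_MPS = 1.943844  # 1 m/s is approximately 1.943844 knots


@dataclass
class RouteSample:
    """One historical navigation-route sample.

    Field names are illustrative; they are not taken from the
    patent's Table 1.
    """
    vehicle_id: str        # e.g., tail number
    airport_id: str        # aerodrome identifier
    timestamp_s: float     # seconds since some epoch
    lat: float
    lon: float
    ground_speed_mps: float

    @property
    def ground_speed_kt(self) -> float:
        """Speed converted to knots, one of the 'suitable units'."""
        return self.ground_speed_mps * KNOTS_PER_MPS


def acceleration(prev: RouteSample, curr: RouteSample) -> float:
    """Acceleration (m/s^2) derived from two consecutive speed samples,
    as the text notes acceleration may be determined from velocity data."""
    dt = curr.timestamp_s - prev.timestamp_s
    return (curr.ground_speed_mps - prev.ground_speed_mps) / dt
```

A datastore of such samples, keyed by vehicle ID and airport identifier, is one plausible shape for historical vehicle data 106.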
  • aerodrome guidance data 108 may include maps or other data representations of a ground surface of an airport, including guidance features configured to guide a vehicle through a particular aerodrome location.
  • the ground surface can be, for example, a taxiway, runway, gate area, apron, hangar bays, or other trafficway or ground surface of an airport.
  • description of an "airport" may apply equivalently to an airbase, an airfield, or any other type of permanent or temporary aerodrome.
  • aerodrome guidance data 108 may include multiple databases specific to a particular aerodrome or multiple aerodromes within a certain vicinity.
  • aerodrome guidance data 108 may include a fixed ground object database specific to the airport, including real-time or recent-time imaging or detection, or a combination of the two, to provide fixed ground object information and provide aerodrome guidance features, such as guidance line coordinates fixed to the surface of an aerodrome.
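Guidance line coordinates fixed to the surface suggest how the first portion of a predicted path could follow painted markings: snap a vehicle position onto the nearest point of a guidance-line segment. The following is a minimal sketch assuming a local planar coordinate frame, which the patent does not specify:

```python
def project_to_guidance_line(px: float, py: float,
                             ax: float, ay: float,
                             bx: float, by: float) -> tuple:
    """Project a vehicle position (px, py) onto a painted guidance-line
    segment from A=(ax, ay) to B=(bx, by), returning the closest point
    on the segment. Coordinates are metres in a hypothetical local
    east/north frame."""
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return (ax, ay)  # degenerate segment: A and B coincide
    # Parameter t of the perpendicular foot, clamped to the segment.
    t = ((px - ax) * dx + (py - ay) * dy) / seg_len_sq
    t = min(max(t, 0.0), 1.0)
    return (ax + t * dx, ay + t * dy)
```

Repeating this over every segment in the guidance-line database and keeping the closest result gives a simple "follow the painted line" path prediction for areas where such markings exist.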
  • data server(s) 132 or other components of collision awareness system environment 102 may be configured to access one or more of aerodrome guidance data 108.
  • data server(s) 132 may identify a particular aerodrome location, such as a particular airport in a particular city, and access aerodrome guidance data 108 specific to the identified aerodrome location.
  • aerodrome guidance data 108 for multiple aerodromes may be included in a single datastore 105, rather than in separate datastores 105 as may be the case in some examples.
  • datastore 105 may further include clearance data 110.
  • clearance data 110 may be included as a separate datastore 105.
  • clearance data 110 may reside with a datastore stored on a computing system of traffic controller 114.
  • traffic controller 114 of a particular aerodrome may include clearance data 110.
  • Traffic controller 114 may further include other data included with datastores 105.
  • Clearance data 110 may include text or audible clearance information generated by traffic controller 114 and/or vehicle 111, as with communications between traffic controller 114 and a receiving vehicle 111.
  • traffic controller 114 may transmit taxiway or runway clearance information to one of vehicles 111 in either text format or voice message.
  • one of vehicles 111 may retrieve the taxiway or runway clearance information from traffic controller 114.
  • one of vehicles 111 may perform a database query for clearance data 110 or otherwise request clearance data 110 from traffic controller 114 or a datastore 105 storing clearance data 110.
  • The text or voice message may be transmitted directly to one of vehicles 111 from traffic controller 114 (e.g., via live communication or from clearance database 110).
  • One of vehicles 111 may then transmit the clearance information to one or more external systems (e.g., cloud systems) via an Aircraft Data Gateway Communication Unit (ADG).
  • vehicle 111 or traffic controller 114 may transmit clearance information to a device executing the collision awareness system.
  • Traffic controller 114 may send a duplicate copy of the clearance message (text or voice) to data server(s) 132 (e.g., a cloud system) via a secured communication protocol.
  • data server(s) 132 may convert any voice related taxiway or runway clearance information to text information and store the clearance information to a predefined location of datastore 105.
  • a collision awareness system implemented on one or more components of collision awareness system environment 102 may utilize data from datastores 105 to determine predicted collision zones of vehicles. In this way, the collision awareness system may help mitigate or reduce collisions between vehicles 111 (involving body, wingtip, or other portion of vehicles 111) and other aircraft, ground vehicles, or other transient or moving objects on an airport ground surface (collectively, "transient surface objects") while aircraft 112 is taxiing, taking off, landing, or stationary, on the airport ground surface.
  • transient surface objects may refer to any aircraft, ground vehicles, or other objects on airport ground surfaces, excluding objects that are permanently fixed in place, and that a collision awareness system may monitor.
  • FIG. 2 is a conceptual block diagram for an example computing system 138 with an example computer-executable collision awareness system 140.
  • collision awareness system 140 may be embodied in any number of different devices, such as one or more of the components of collision awareness system environment 102 described with reference to FIG. 1 .
  • computing system 138 implementing collision awareness system 140 may be described as executing various techniques of this disclosure across one or more data server(s) 132, such as executing on a cloud server. It will be understood, however, that computing system 138 may be implemented on traffic controller 114, user interface device(s) 104, vehicles 111, or other network devices designed to provide vehicle collision awareness.
  • collision awareness system 140 may execute on any one or more of processing circuitry 142 of computing devices corresponding to a traffic controller 114, user interface device(s) 104, vehicles 111, or other network devices, and combinations thereof.
  • collision awareness system 140 may execute based on data from storage device(s) 146 included with any one or more of processing circuitry 142 of computing devices corresponding to a traffic controller 114, user interface device(s) 104, vehicles 111, or other network devices, and/or data stores 105, in cases where one or more of databases 106, 108, or 110 are implemented as storage devices separate from storage device(s) 146.
  • storage device(s) 146 may include one or more of databases 106, 108, or 110.
  • computing system 138 may implement collision awareness system 140 via processing circuitry 142, communication circuitry 144, and/or storage device(s) 146.
  • computing system 138 may include display device 150.
  • display device 150 may include any display device, such as a liquid crystal display (LCD) or a light emitting diode (LED) display or other type of screen, with which processing circuitry 142 may present information related to predicted collision zones.
  • display device 150 may not be included with computing system 138.
  • computing system 138 may be one of data server(s) 132 configured to perform various techniques of this disclosure and transmit to another device, such as one of user interface devices 104, collision zone data for display.
  • display device 150 may configure collision zone information to be graphically rendered on a ground navigation application implemented by the aircraft system.
  • display device 150 may configure position/velocity information for one or more transient surface objects to be graphically rendered on a ground navigation application implemented by the aircraft system.
  • the display device may generate graphical display format data based on the position and velocity information configured compatibly with the graphical outputs of an AMMD application, such that an AMMD application may overlay, superimpose, or otherwise integrate graphical display format data with existing AMMD graphical display outputs.
  • Display device 150 generates outputs including or in the form of the graphical display format data, such that the outputs may be readily configured to be received and graphically rendered by an AMMD application executing on an EFB (e.g., on a tablet computer) in the cockpit of an aircraft in motion on the airport ground surfaces, as further described below.
  • collision awareness system 140 may provide outputs, including alerts or warnings, that may be immediately available, via display device 150, to inform pilots or other flight crew of a potential hazard of an impending collision, such that the pilot or flight crew can take appropriate action.
  • display device 150, including one or more display processors, may be incorporated in a single processor, electronic system and/or device, or software system with an integrated implementation of collision awareness system 140, in an integrated collision avoidance logic and display processing subsystem.
  • user interface device 104 may include collision awareness system 140 and display device 150 as a single device, such as an EFB.
  • processing circuitry 142 may include fixed function circuitry and/or programmable processing circuitry.
  • Processing circuitry 142 may include any one or more of a microprocessor, a controller, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or equivalent discrete or analog logic circuitry.
  • processing circuitry 142 may include multiple components, such as any combination of one or more microprocessors, one or more controllers, one or more DSPs, one or more ASICs, or one or more FPGAs, as well as other discrete or integrated logic circuitry.
  • the functions attributed to processing circuitry 142 herein may be embodied as software, firmware, hardware or any combination thereof.
  • communication circuitry 144 may include a wireless network interface card (WNIC) or other type of communication module.
  • communication circuitry 144 may have an Internet Protocol (IP) port coupled to an ethernet connection or to an output port, such that communication circuitry 144 receives outputs from processing circuitry 142.
  • Communication circuitry 144 may be configured to connect to a Wi-Fi™ or other wireless network connection.
  • communication circuitry 144 may be separate from collision awareness system 140.
  • collision awareness system 140 may include processing circuitry 142 of computing system 138, whereas communication circuitry 144 may be included in a separate computing system.
  • collision awareness system 140 may include one or more storage device(s) 146.
  • storage device(s) 146 may include one or more of datastores 105 and may be similarly configured to store data.
  • storage device(s) 146 may include any data structure (and/or combinations of multiple data structures) for storing and/or organizing data, including, but not limited to, relational databases (e.g., Oracle databases, MySQL databases, etc.), non-relational databases (e.g., NoSQL databases, etc.), in-memory databases, spreadsheets, comma separated values (CSV) files, extensible markup language (XML) files, TeXT (TXT) files, flat files, spreadsheet files, and/or any other widely used or proprietary format for data storage.
  • storage device(s) 146 may include executable instructions that when executed cause processing circuitry 142 to execute various techniques of this disclosure.
  • storage device(s) 146 may include machine learning (ML) model(s) 148.
  • ML model(s) 148 may be included on a separate storage device.
  • ML model(s) 148 may be stored in datastore 105 on data server(s) 132.
  • processing circuitry 142 may execute ML model(s) 148 via network connection 130.
  • a trained ML model can be used to process and predict paths, vehicle positions, or collision zones in accordance with certain examples of this disclosure where ML models are considered advantageous (e.g., predictive modeling, inference detection, contextual matching, natural language processing, etc.).
  • ML models that may be used with aspects of this disclosure include classifiers and non-classification ML models, artificial neural networks ("NNs"), linear regression models, logistic regression models, decision trees, support vector machines (“SVM”), Naive or a non-Naive Bayes network, k-nearest neighbors (“KNN”) models, k-means models, clustering models, random forest models, or any combination thereof.
  • These models may be trained based on data stored in datastores 105. For example, certain aspects of the disclosure will be described using predicted paths generated from a ML model trained on data from datastores 105, for purposes of illustration only.
  • a ML system or pattern recognition system may be accessed or incorporated by collision awareness system 140.
  • ML model(s) 148 may incorporate knowledge of the predictable future motions of aircraft, ground vehicles, or other objects based on statistical training of one or more ML model(s) 148 or pattern recognition system based on large training data sets of past motions of aircraft, ground vehicles, and other objects as statistically sampled over a specific airport or a representative collection of airports over time.
  • Such a ML system or pattern recognition system may also incorporate statistical training of observed motions of aircraft, ground vehicles, and other transient objects on airport ground surfaces as correlated with a variety of conditions such as traffic levels, weather and visibility conditions, and time of day, for example.
  • One or more initially trained ML model(s) 148 may be further refined with a large corpus of data of motions of aircraft, ground vehicles, and other transient objects on airport ground surfaces in comparison with motions predicted by the one or more ML model(s) 148.
  • collision awareness system 140 of computing system 138 may implement an expert rules system that may incorporate knowledge of general airline gate assignments, specific gate assignments for a particular aircraft or a given flight, and data on assigned taxiing routes between gate areas and runways, that ML model(s) 148 may use to predict routes.
  • processing circuitry 142 may deploy a ML model 148 trained on historical navigation route data from historical vehicle data 106 and trained on general airline gate assignments to predict routes of one of vehicles 111, including worst case and best case scenario routes that may be combined to determine a single predicted route, in accordance with techniques of this disclosure.
  • collision awareness system 140 may be enabled and implemented in existing airport systems, vehicles 111, and/or user interface devices 104, with only minimal hardware or software changes. In addition, in some examples, collision awareness system 140 may be configured to provide credible false alarm mitigation as well as the capability to use a variety of data inputs such as from Automatic Dependent Surveillance - Broadcast (ADS-B) sources, and provide coverage of any types of vehicles 111 that might potentially collide with a fixed structure or another one of vehicles 111.
  • FIG. 3 depicts a flowchart for an example process 300 that collision awareness system 140 may implement for providing collision zone predictions, in accordance with illustrative aspects of this disclosure.
  • Process 300 may include some features that may be optional in some examples.
  • process 300 includes obtaining (e.g., by processing circuitry 142 of collision awareness system 140) historical navigation route data for one or more vehicles 111 (302).
  • processing circuitry 142 may identify historical navigation route data from historical vehicle data 106.
  • the historical navigation route data may be based at least in part on transponder positional data from vehicles 111.
  • vehicles 111 may transmit, via network 130, navigation route data that may be stored over time as historical navigation route data in historical vehicle datastore 106.
  • Process 300 further includes identifying (e.g., by processing circuitry 142) a plurality of aerodrome guidance features for a particular aerodrome location (304).
  • processing circuitry 142 may identify aerodrome guidance features for a particular airport from aerodrome guidance data 108.
  • the aerodrome guidance features may include guidance marker information, such as guidance signs and guidance lines of an aerodrome location.
  • guidance lines may include coordinates of line markings fixed to an aerodrome surface, such as by being painted on a ground surface.
  • Process 300 further includes determining (e.g., by processing circuitry 142) a predicted path of a first vehicle (306).
  • the predicted path may include a first portion of the predicted path and a second portion of the predicted path.
  • the first portion of the predicted path may include areas of an aerodrome including guidance features, such as on a taxiway or runway, whereas the second portion of the predicted path may include areas of an aerodrome that do not include defined guidance features or areas for which guidance features are unavailable from a database.
  • processing circuitry 142 may predict the first portion of the predicted path using the aerodrome guidance features, such as particular guidance marker or guidance signs information, and may predict the second portion of the predicted path using the historical navigation route data from datastore 106.
  • processing circuitry 142 may predict the first portion of the predicted path using historical navigation route data from datastore 106.
  • processing circuitry 142 may receive information from a particular one of vehicles 111 regarding speed information of vehicle 111 (e.g., the current speed of an aircraft).
  • processing circuitry 142 may align the historical path points (e.g., as described with reference to Table 1), speed, and time parameters with surface guidance lines included with aerodrome guidance data 108.
  • the historical navigation route data may indicate how long on average a vehicle takes to travel from one position along a guidance line to another position along the guidance line depending on various factors, such as time of day, vehicle size and weight, traffic flow information, etc.
  • processing circuitry 142 may estimate various ahead positions along the guidance lines based on the historical navigation route data.
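  • For illustration, the ahead-position estimate described above can be sketched in Python. The helper below, its names, and its planar (x, y) coordinates in meters are assumptions for illustration only; a distance such as average speed times elapsed time, derived from the historical navigation route data, is walked along the guidance-line points:

```python
import math

def point_along_guidance_line(line_points, distance):
    """Walk a polyline of guidance-line points (planar x/y in meters)
    and return the interpolated point `distance` along it.
    Hypothetical helper illustrating an 'ahead position' estimate."""
    if distance <= 0:
        return line_points[0]
    travelled = 0.0
    for (x0, y0), (x1, y1) in zip(line_points, line_points[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)  # segment length
        if travelled + seg >= distance:
            f = (distance - travelled) / seg  # fraction into this segment
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
        travelled += seg
    return line_points[-1]  # requested distance is past the end of the line
```

For example, if the historical data indicated an average taxi speed of 7.5 m/s, a position 2 seconds ahead could be estimated as `point_along_guidance_line(line_points, 7.5 * 2)`.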
  • An illustrative example of processing circuitry 142 using historical navigation route data aligned along (e.g., mapped to) an example aerodrome guidance feature is described below with reference to FIG. 4 .
  • the predicted path of the first vehicle is based on a combination of initially predicted paths of the first vehicle, the initially predicted paths including a likelihood of the first vehicle traveling a first initially predicted path and a likelihood of the first vehicle traveling a second initially predicted path.
  • processing circuitry 142 may combine or average a predicted path representing a plurality of best case historical paths and worst case historical path segments to generate the predicted path.
  • the predicted path may be specific to a particular one of vehicles 111 and may include a path connecting points for the particular one of vehicles 111 from a first time (e.g., A1(t1)) to one or more other prospective times (e.g., A1(t1 + n), where n represents an integer number of time steps added to the first time t1).
  • processing circuitry 142 may determine at least two initially predicted paths of the first vehicle.
  • the processing circuitry 142 may also identify a likelihood of the first vehicle traveling any of the initially predicted paths.
  • the likelihood may include an indication of a best case path, meaning that the predicted path is most likely to occur, or an indication of a worst case path, meaning that the predicted path is least likely to occur, or other paths that have likelihoods that fall between best case and worst case paths.
  • processing circuitry 142 may determine the predicted path of the first vehicle based on a combination of at least two initially predicted paths of the first vehicle. The combination may be based on processing circuitry 142 deploying a ML model able to determine initially predicted paths, combine predicted paths, or both to determine a combined predicted path.
  • the combination may be based on the likelihood information, such that more weight is provided to paths that are more likely to occur under the circumstances and less weight is provided to paths that are less likely to occur under the same circumstances.
  • processing circuitry 142 may determine a weighted average of initially predicted paths or deploy one of ML model(s) 148 to determine a weighted average or other combinations of initially predicted paths.
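  • One plausible reading of the likelihood-weighted combination is sketched below. The function name and the point-wise averaging of equally sampled paths are illustrative assumptions; in practice, one of ML model(s) 148 could perform this combination instead:

```python
def combine_predicted_paths(paths, likelihoods):
    """Combine initially predicted paths (lists of (lat, lon) points
    sampled at the same prospective times) into one predicted path via
    a likelihood-weighted average of corresponding points."""
    if not paths or len(paths) != len(likelihoods):
        raise ValueError("need exactly one likelihood per path")
    total = float(sum(likelihoods))
    weights = [w / total for w in likelihoods]  # normalize likelihoods
    n = min(len(p) for p in paths)
    return [
        (
            sum(w * p[i][0] for p, w in zip(paths, weights)),
            sum(w * p[i][1] for p, w in zip(paths, weights)),
        )
        for i in range(n)
    ]
```

For example, a best case path weighted at 0.75 and a worst case path weighted at 0.25 would yield a combined path that lies three quarters of the way toward the best case points.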
  • processing circuitry 142 may classify and transfer the predicted path data to datastore 106. Processing circuitry 142 may classify the predicted path based on aircraft type, such that the predicted path may be referenced for future use with similarly situated vehicles 111.
  • processing circuitry 142 may retrieve certain historical data from historical vehicle data 106, to identify a predicted path based on a particular vehicle type and/or current state information of the vehicle.
  • the particular vehicle may be an ownship executing the techniques of this disclosure on an EFB located in the cockpit of the ownship.
  • processing circuitry 142 of the EFB may select the most suitable historical data for determining a predicted path.
  • the selection of the most suitable historical data may vary from location to location when identifying predicted paths using the ownship parameters.
  • ML models may be used to select the most suitable path based on the current data and historical data.
  • processing circuitry 142 may identify clearance information of a traffic controller defining one or more target marks for the first vehicle 111. For example, processing circuitry 142 may query a database for clearance information.
  • the clearance information may include a destination location as one target mark for the first vehicle but may also include multiple target marks along the way to the destination location, such that a vehicle will follow the path along the target mark to reach the destination location.
  • processing circuitry 142 may identify one or more target aiming features from the plurality of aerodrome guidance features based at least in part on the clearance information.
  • the one or more target aiming features may be configured to provide guidance through a particular part of an aerodrome toward the one or more target marks.
  • the target aiming features may include aerodrome guidance features that aim or steer a vehicle toward a target, such as by providing arrows, whether virtual or real, that guide vehicle 111 to a target.
  • processing circuitry 142 may identify the first portion of the predicted path using the one or more target aiming features.
  • the first portion may include aerodrome guidance features, such that historical navigation data and aerodrome guidance features may be used in conjunction with one another to determine a predicted path of a vehicle 111 through the first portion of the predicted path.
  • processing circuitry 142 may identify the second portion of the predicted path using the clearance information and historical navigation route data, the second portion of the predicted path including a destination location of the first vehicle defined by the clearance information as a target mark of the one or more target marks.
  • the second portion of the predicted path may be through an apron area of an aerodrome that does not include guidance features and thus, historical navigation data may be used to predict the portion of the path.
  • the first portion and the second portion may be switched in some examples, such as when a vehicle is leaving an apron or gate area toward a runway. That is, in some examples, the second portion of the predicted path may include an airport apron region or may include a taxiway comprising surface guidance markings, depending on which direction the vehicle 111 is destined to travel (e.g., toward a gate, toward a runway, toward a hangar bay, or somewhere between, etc.).
  • Process 300 further includes determining (e.g., by processing circuitry 142) a predicted position of the first vehicle along the predicted path at a prospective time (308).
  • processing circuitry 142 may implement regression algorithms to predict an immediate accurate position using the previous position, speed, and heading information (e.g., A1(t + 1) to A1(t + 2), where time t is in seconds).
  • the regression models are used to minimize position deviation error in the historical data.
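  • Under a simple assumption of locally constant speed, the regression step could be an ordinary least squares fit of along-track distance against time, extrapolated one step ahead. The helper below is a hypothetical sketch, not the specific regression model of this description:

```python
def extrapolate_along_track(times_s, distances_m, t_next_s):
    """Fit d = a*t + b by ordinary least squares over recent
    (time, along-track distance) samples and extrapolate to t_next_s.
    The slope a is an estimated speed in m/s."""
    n = len(times_s)
    mt = sum(times_s) / n
    md = sum(distances_m) / n
    var = sum((t - mt) ** 2 for t in times_s)
    cov = sum((t - mt) * (d - md) for t, d in zip(times_s, distances_m))
    a = cov / var       # estimated speed (m/s)
    b = md - a * mt     # intercept
    return a * t_next_s + b
```

Fitting over several recent samples, rather than differencing the last two positions, is one way such a regression can minimize position deviation error in noisy historical data.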
  • data points such as those shown in Table 1 above, may be used to calculate a cumulative distance of a particular vehicle 111. For example, processing circuitry 142 may calculate the cumulative distance from A1(t1) to A1(t + n).
  • processing circuitry 142 may determine all intermediate path points of a path segment from time 't' to 't + n', assuming all intermediate points are locally linear.
  • processing circuitry 142 may utilize a function, such as a great circle distance formulae, great circle earth model, or an equivalent projection system formulae to determine position information based on the calculated distance and direction.
  • Processing circuitry 142 may determine the directionality of one of vehicles 111 from the predicted positions along the predicted path. As shown in FIGS. 5A-5C , A1(t1) may be determined based on the function using the current position of the particular vehicle 111 and a cumulative distance according to equations [1] or [2] below.
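  • A standard spherical-earth destination-point computation, consistent with the great circle earth model mentioned above, might look like the following sketch (the earth radius constant and function name are assumptions, and the exact equations [1] and [2] may differ):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean spherical earth radius (assumption)

def destination_point(lat_deg, lon_deg, bearing_deg, distance_m):
    """Great-circle 'forward' computation: the fix reached after
    travelling distance_m on initial bearing bearing_deg from the
    start fix (lat_deg, lon_deg)."""
    lat1, lon1 = math.radians(lat_deg), math.radians(lon_deg)
    brg = math.radians(bearing_deg)
    ang = distance_m / EARTH_RADIUS_M  # angular distance in radians
    lat2 = math.asin(
        math.sin(lat1) * math.cos(ang)
        + math.cos(lat1) * math.sin(ang) * math.cos(brg)
    )
    lon2 = lon1 + math.atan2(
        math.sin(brg) * math.sin(ang) * math.cos(lat1),
        math.cos(ang) - math.sin(lat1) * math.sin(lat2),
    )
    # normalize longitude to [-180, 180)
    return math.degrees(lat2), (math.degrees(lon2) + 540.0) % 360.0 - 180.0
```

For example, travelling due north from (0, 0) by one degree of arc of the assumed earth radius yields a fix near (1.0, 0.0).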
  • processing circuitry 142 may determine movement information of the first vehicle at a current position.
  • the movement information may include speed information of the first vehicle.
  • processing circuitry 142 may receive sensor data or transponder data from one of vehicles 111 indicating a rate at which the particular vehicle 111 is traveling.
  • processing circuitry 142 may identify the predicted position of the first vehicle using the movement information and the historical navigation route data. For example, processing circuitry 142 may determine how much distance the particular vehicle 111 will travel along a predicted path based on the rate at which the particular vehicle 111 is traveling at a current position along the predicted path.
  • Process 300 further includes determining (e.g., by processing circuitry 142) a predicted position of a second vehicle with respect to approximately the same prospective time (310).
  • processing circuitry 142 may determine a predicted position of the first one of vehicles 111 at a time 15 seconds in the future and thus, may determine a predicted position of another one of vehicles 111 at a time 15 seconds in the future.
  • processing circuitry 142 may determine a predicted position of each vehicle at any number of prospective times, and processing circuitry 142 will likely find a time or range of times in the future that represent when a vehicle collision is likely to occur (e.g., 14-15 seconds in the future).
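  • The scan over prospective times described above can be sketched as follows; the planar coordinates, the dictionary representation of predicted positions, and the separation threshold are illustrative assumptions rather than the claimed envelope comparison:

```python
import math

def first_conflict_time(times_s, path_a, path_b, separation_m):
    """Scan a shared grid of prospective times and return the first time
    at which the two predicted positions (planar x/y in meters, keyed by
    time) come within separation_m of each other, or None if no conflict
    is predicted."""
    for t in times_s:
        (xa, ya), (xb, yb) = path_a[t], path_b[t]
        if math.hypot(xa - xb, ya - yb) < separation_m:
            return t
    return None
```

A finer time grid narrows the predicted collision window (e.g., from "14-15 seconds in the future" to a sub-second estimate) at the cost of more position predictions.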
  • the prospective time corresponding to the predicted position of the second vehicle may be the same as the prospective time corresponding to the predicted position of the first vehicle.
  • the predicted position for both vehicles may correspond to a prospective time of 15 seconds in the future.
  • the predicted positions may not correspond to the exact same prospective times.
  • the predicted positions may be determined at different intervals.
  • the predicted position for a first vehicle may be determined on a second-by-second basis, whereas the predicted position for a second vehicle may be determined on a half-second or every other second basis.
  • processing circuitry 142 may perform interpolation techniques to predict collision zones at times that are approximately the same (e.g., within a half second or seconds of one another), but may not be exactly the same.
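  • One simple interpolation technique for comparing predictions made on different intervals is linear interpolation between the bracketing predicted positions; the sketch below, with planar coordinates, is an assumption for illustration:

```python
def position_at(t_s, timeline):
    """Linearly interpolate a predicted position at time t_s from a
    time-sorted list of (time_s, (x, y)) predictions, so that positions
    predicted on different intervals (e.g., second-by-second versus
    half-second) can be compared at approximately the same time."""
    for (t0, p0), (t1, p1) in zip(timeline, timeline[1:]):
        if t0 <= t_s <= t1:
            f = 0.0 if t1 == t0 else (t_s - t0) / (t1 - t0)
            return (p0[0] + f * (p1[0] - p0[0]),
                    p0[1] + f * (p1[1] - p0[1]))
    raise ValueError("t_s is outside the predicted timeline")
```

With both vehicles' timelines resampled through such a helper, envelope comparisons can be made on a single common time grid.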
  • Process 300 further includes performing (e.g., by processing circuitry 142) a comparison of vehicle envelopes for the first vehicle and the second vehicle at the predicted positions (312).
  • processing circuitry 142 may retrieve vehicle envelope data from historical vehicle data 106.
  • the vehicle envelope data may include safe zone envelopes for a single vehicle 111 or multiple vehicles 111, such as in the case of an aircraft 112A being towed by a vehicle tug 113A.
  • Process 300 further includes identifying (e.g., by processing circuitry 142) an overlap of vehicle envelopes (314).
  • processing circuitry 142 may predict a position of one vehicle 112A as turning toward a stationary vehicle 112B.
  • Processing circuitry 142 may determine that through the turn, a safe zone envelope of vehicle 112A will overlap with stationary vehicle 112B and thus, may identify a predicted collision zone. That is, process 300 may be concluded after determining (e.g., by processing circuitry 142) a predicted collision zone of the first vehicle and the second vehicle at the prospective time based at least in part on the overlap of the vehicle envelopes (316).
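  • Assuming circular safe zone envelopes for simplicity (as drawn in FIGS. 5A-5C), the overlap test reduces to comparing the distance between envelope centers with the sum of the envelope radii; the haversine helper and function names below are assumptions:

```python
import math

def haversine_m(a, b, radius_m=6371000.0):
    """Great-circle distance in meters between two (lat, lon) fixes,
    using a mean spherical earth radius (assumption)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * radius_m * math.asin(math.sqrt(h))

def envelopes_overlap(pos_a, radius_a_m, pos_b, radius_b_m):
    """Circular safe-zone envelopes overlap when the distance between
    centers is less than the sum of the envelope radii."""
    return haversine_m(pos_a, pos_b) < radius_a_m + radius_b_m
```

Vehicle-shaped (non-circular) envelopes would instead require a polygon intersection test, but the circular case conveys the predicted-collision-zone check.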
  • processing circuitry 142 uses the techniques and data described in process 300 to predict movement of vehicles 111, where an exact location may not be available for the vehicle 111 based on sensor or transponder data.
  • processing circuitry 142 executing process 300 may include processing circuitry 142 of one or more remote data server(s) 132.
  • processing circuitry 142 of the one or more remote data server(s) 132 may receive a current position of the first vehicle 111 or an indication of the current position.
  • Processing circuitry 142 may execute all or some of process 300 in order to determine a predicted collision zone.
  • processing circuitry 142 of the remote server may transmit the predicted collision zone from the remote server to the first vehicle 111, such as to an EFB or other user interface device 104 that corresponds to vehicle 111.
  • FIG. 4 depicts an example technique that collision awareness system 140 may implement for aligning historical navigation routes with aerodrome guidance features when determining predicted positions of vehicles 111.
  • processing circuitry 142 may align historical navigation routes so as to coincide with aerodrome guidance features, such as surface guidance lines.
  • processing circuitry 142 may predict a path for a first vehicle 111 from an interim source point 404A to an interim destination point 404B (e.g., target marks).
  • the source and destination points 404 may be in an area of an aerodrome having aerodrome guidance features.
  • a surface guidance line 406 may be between source and destination points 404.
  • processing circuitry 142 may determine source and destination points 404 based on various predicted points along a predicted path, where the predicted path may be updated as vehicle 111 approaches each predicted point along the predicted path. In some examples, the source and destination points 404 may be based on vehicle clearance data received from traffic controller 114. In some examples, processing circuitry 142 may utilize a combination of clearance data and historical navigation route data to determine points 404A and 404B configured to guide vehicle 111 to an ultimate destination point that may deviate from areas of the aerodrome having surface guidance lines, such as surface guidance line 406.
  • processing circuitry 142 may determine historical navigation route data 410 between points 404A and 404B. Processing circuitry 142 may determine the historical navigation route data 410 from aerodrome guidance data 108. In some examples, historical navigation route data 410 of FIG. 4 may be a combination (e.g., an average) of a plurality of predicted paths, combined into a single predicted path 410 made up of various predicted points along predicted path 410 (e.g., based on weighted averages based on likelihood of each predicted path).
  • Processing circuitry 142 may align historical data points 410 along a predicted path between points 404A and 404B to determine an aligned predicted path 412.
  • the aligned predicted path 412 may be aligned along surface guidance features, such as surface guidance line 406.
  • Processing circuitry 142 may use aligned predicted path 412 to determine predicted points (e.g., target marks) along the aligned predicted path 412 at prospective times in order to determine collision zones in accordance with various techniques of this disclosure.
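  • The alignment of historical data points 410 onto surface guidance line 406 can be sketched as snapping each historical point to its closest point on the guidance-line polyline; the planar coordinates and helper name below are illustrative assumptions:

```python
def snap_to_polyline(point, polyline):
    """Return the closest point on a planar polyline (x/y in meters) to
    `point`, sketching the alignment of a historical path point onto a
    surface guidance line."""
    px, py = point
    best, best_d2 = None, float("inf")
    for (x0, y0), (x1, y1) in zip(polyline, polyline[1:]):
        dx, dy = x1 - x0, y1 - y0
        seg2 = dx * dx + dy * dy
        # parameter of the perpendicular foot, clamped to the segment
        t = 0.0 if seg2 == 0 else max(
            0.0, min(1.0, ((px - x0) * dx + (py - y0) * dy) / seg2))
        cx, cy = x0 + t * dx, y0 + t * dy
        d2 = (px - cx) ** 2 + (py - cy) ** 2
        if d2 < best_d2:
            best, best_d2 = (cx, cy), d2
    return best
```

Applying the helper to every point of a historical route (e.g., `[snap_to_polyline(p, guidance_line) for p in historical_points]`) yields an aligned predicted path analogous to aligned predicted path 412.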
  • target marks refer to various points on a surface that a vehicle may target as the vehicle proceeds along a path in order for the vehicle to navigate to a final target mark or final destination.
  • Target marks may change over time, e.g., as processing circuitry 142 updates the predicted path.
  • Processing circuitry 142 may further update predicted positions along the updated predicted paths over time, e.g., based on changes in vehicle speed.
  • FIGS. 5A-5C depict conceptual diagrams of a portion of an aerodrome 500 with various aircraft and ground vehicles on airport runways, taxiways, and other aerodrome ground surfaces.
  • the various aircraft and ground vehicles shown in FIGS. 5A-5C include aircraft 112 and ground vehicles 113 (designated in FIGS. 5A-5C as A1-A4 for simplicity).
  • collision awareness system 140 may be configured to determine predicted paths and positions of vehicles 111, actual positions and velocities of the vehicles 111, determine alert envelopes, and predict collision zones of vehicles 111.
  • Collision awareness system 140 may then output the position and velocity information for the one or more transient surface objects and indications of potential collision zones to network 130, such that these outputs from collision awareness system 140 may be received by an EFB on at least one of the vehicles 111 among the transient surface objects on the ground surfaces of aerodrome 500.
  • FIG. 5A illustrates a simplified example of two vehicles A1 and A2, which may be aircraft among vehicles 112A-112N, but will be referred to as vehicles A1 and A2 for simplicity in illustrating a time progression using tX indicators.
  • t0-tX indicate time in seconds.
  • t0 indicates an initial starting point of time.
  • t5 indicates a predicted position after 5 seconds have passed.
  • vehicle A1 has received clearance information to park at a particular destination location.
  • Processing circuitry 142 may predict a path 502 of vehicle A1 in accordance with various techniques of this disclosure.
  • processing circuitry 142 may deploy a ML model to determine a combined predicted path determined from best case and worst case path predictions as informed by historical navigation route data at aerodrome 500 or other aerodromes.
  • Processing circuitry 142 may determine a predicted position of vehicle A1 along predicted path 502 at any time interval. While the examples of FIGS. 5A-5C show 5 second intervals, the techniques of this disclosure are not so limited, and any time interval may be used including a variable time interval. In the example of FIG. 5A , four predicted positions are predicted for vehicle A1 at 5 seconds, 10 seconds, 15 seconds, and 30 seconds.
  • Processing circuitry 142 may determine the predicted positions using historical navigation route data, aligned with aerodrome guidance features where available, or in some examples, may use historical navigation route data without aerodrome guidance features where the aerodrome guidance features are unavailable, such as in areas of aerodrome 500 where guidance lines are nonexistent (e.g., apron area, a gate area, etc.).
  • Processing circuitry 142 may further predict a predicted path and position of vehicle A2.
  • vehicle A2 may be an aircraft that received clearance information from traffic controller 114 indicating one or more target marks for vehicle A2.
  • processing circuitry 142 may identify a predicted path of the second vehicle using a predicted current position of the second vehicle.
  • processing circuitry 142 may predict the current position of the second vehicle based on the historical navigation route data.
  • processing circuitry 142 may deploy an ML model trained on historical navigation route data, aerodrome guidance features, and/or clearance information to determine the predicted path.
  • the ML model may identify patterns in the historical navigation route data, aerodrome guidance features, and/or clearance information that indicate a predicted path that a vehicle is likely to take toward a target mark or target destination location.
  • processing circuitry 142 may predict positions of vehicle A2 along predicted route 503, the predicted positions including at least one time that coincides with a predicted position time with respect to vehicle A1 (e.g., t10, t15, t30). In some examples, processing circuitry 142 may determine the predicted position of the second vehicle using the predicted current position of the second vehicle and one or more of: the historical navigation route data, the plurality of aerodrome guidance features, or clearance information for the second vehicle. In the example of FIG. 5A , processing circuitry 142 may determine that an overlap of safe zone envelopes for both vehicles A1 and A2 will occur at a prospective time of 15 seconds in the future, unless certain changes are made to the system, such as a slowing or speeding up of one or the other vehicle.
  • Processing circuitry 142 may perform another prediction at various intervals using new information as the information becomes available, such as velocity or acceleration data of vehicles A1 and A2. In any event, processing circuitry 142 may identify the predicted collision zone as the area of overlap 508. While safe zone envelopes are shown as a circle in FIGS. 5A-5C , safe zone envelopes may be any shape and may be specific to the shape of the particular vehicle 111. For example, where vehicle 112A (e.g., A1) is a particular aircraft having a particular size and shape, the safe zone envelope for vehicle 112A may resemble the size and shape of vehicle 112A, such that the detected overlap will indicate where on the vehicle the overlap is predicted to occur. In the example of FIG. 5A , processing circuitry 142 may determine the overlap of envelopes 508 indicates that the nose of vehicle A2 is predicted to collide with the left wing of vehicle A1 at a prospective time of t15.
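The overlap check described above can be sketched as follows, assuming circular safe zone envelopes and predicted positions at coinciding times (the data shapes here are illustrative assumptions, not the disclosed implementation):

```python
import math
from dataclasses import dataclass

@dataclass
class PredictedPosition:
    t: float       # seconds in the future
    x: float       # metres, aerodrome-local coordinates
    y: float
    radius: float  # safe zone envelope radius, metres

def collision_zones(path_a, path_b):
    """Return (time, midpoint) for each coinciding predicted time at which
    the circular safe zone envelopes of the two vehicles overlap."""
    b_by_time = {p.t: p for p in path_b}
    zones = []
    for pa in path_a:
        pb = b_by_time.get(pa.t)
        if pb is None:
            continue  # no coinciding prediction for the second vehicle
        d = math.hypot(pa.x - pb.x, pa.y - pb.y)
        if d < pa.radius + pb.radius:
            zones.append((pa.t, ((pa.x + pb.x) / 2, (pa.y + pb.y) / 2)))
    return zones
```

Vehicle-shaped envelopes, as described above, would replace the circle test with a polygon intersection test, but the per-timestep structure stays the same.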
  • If there are any collision zones ahead of the ownship, processing circuitry 142 generates a visual or text notification for display on user interface device 104.
  • the below equation [1] is used for computing the cumulative distance along the track from time 't' to time 't + n' where 't' is in seconds and 'n' is a positive integer value.
  • distance = u × t + (1/2) × a × t²     [1]
  • 'u' refers to the velocity or speed
  • 't' refers to the change in time
  • 'a' refers to the acceleration (change in speed).
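Equation [1] can be evaluated at whole-second steps to tabulate cumulative distance along the track from time t to t + n; a minimal sketch:

```python
def distance_traveled(u, t, a):
    """Equation [1]: distance = u*t + (1/2)*a*t^2, for initial speed u and
    constant acceleration a over a time change t."""
    return u * t + 0.5 * a * t ** 2

def cumulative_distances(u, a, n):
    """Cumulative distance along the track at each whole second from t to
    t + n, where n is a positive integer, per equation [1]."""
    return [distance_traveled(u, dt, a) for dt in range(n + 1)]
```

For example, at u = 10 m/s with a = 2 m/s², the vehicle covers 11 m after one second and 24 m after two.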
  • vehicle A3 represents an aircraft tug vehicle pulling an aircraft.
  • Processing circuitry 142 may predict a path of vehicle A3 and determine that vehicles A1 and A3 are not predicted to collide because at 5 seconds, vehicle A1 is predicted to be beyond the intersection point of the predicted paths of vehicles A1 and A3.
  • processing circuitry 142 may predict a first portion of a predicted path of vehicle A3 using historical navigation route data, such as in the apron area of aerodrome 500.
  • processing circuitry 142 may predict a second portion of predicted path of vehicle A3 using historical navigation route data and aerodrome guidance features, such as in an area of aerodrome 500 that includes guidance features that vehicles 111 are expected to follow to reach a predefined destination.
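The per-area choice of prediction inputs described above can be sketched as a simple selector (the area labels are assumptions for illustration):

```python
def prediction_inputs(area_type):
    """Select path-prediction data sources for an aerodrome area.

    Aerodrome guidance features are used where they exist (e.g., taxiways
    with painted guidance lines); apron and gate areas, which may lack such
    features, fall back to historical navigation route data alone.
    """
    sources = ["historical_navigation_route_data"]
    if area_type not in ("apron", "gate"):
        sources.append("aerodrome_guidance_features")
    return sources
```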
  • The example of FIG. 5C is similar except that a vehicle A4 is predicted to be located near the path of A3.
  • vehicle A4 may be a parked vehicle that has the avionics system turned off.
  • processing circuitry 142 may use historical navigation route data to determine the predicted position of vehicle A4 and determine a predicted path of vehicle A4.
  • vehicle A4 does not have a predicted path in the foreseeable future based on clearance information, historical navigation route data, and/or other aircraft information available to processing circuitry 142, such as flight times relating to vehicle A4, etc.
  • processing circuitry 142 may determine whether vehicle A3 will have enough clearance to follow along a predicted path without clipping parked vehicle A4.
  • processing circuitry 142 may determine a predicted collision zone between vehicles A3 and A4 and provide a notification for display on one of user interface devices 104.
  • FIG. 6 is a diagram of an example graphical output display 610 of an airport ground surface map that may be implemented by a two-dimensional airport moving map display (2D AMMD) application that may be implemented on an EFB.
  • a 2D AMMD application may be implemented on one of user interface devices 104, such as an EFB tablet computer in the cockpit of a particular aircraft (e.g., aircraft 112 in FIG. 1 ), e.g., while the aircraft is on the ground and, for example, taxiing or otherwise moving.
  • the 2D AMMD application may be implemented on another device, other than or in addition to, one of user interface device(s) 104.
  • a graphical output display analogous to graphical output display 610 may be implemented by a three-dimensional AMMD application that may be implemented on an EFB or other application package executing on a tablet computer or other type of computing and display device.
  • the AMMD application graphical output display 610 includes representations (e.g., graphical icons) of transient surface vehicles that may be received and/or decoded by a transient surface object overlay module of an AMMD application executing on user interface device 104 that provides AMMD 610.
  • AMMD 610 thus includes a graphical icon of an ownship 612 (e.g., that may correspond to vehicles 111 in FIG. 1 and one of vehicles A1-A4 in FIGS. 5A-5C), graphical icons of other moving vehicles 614, 616, and 618 (e.g., corresponding to other vehicles 111), and graphical icons of ground vehicles 622, 624 (e.g., corresponding to ground vehicles 113).
  • AMMD 610 also includes representations of aerodrome guidance features, such as surface guidance markers 642 and 644.
  • AMMD 610 may also include representations of taxiways 634, 636, and 638, and apron areas 652, 654, and 656 near airport terminal building portions 662, 664, and 666.
  • AMMD 610 may include indications of potential collision zones provided by collision awareness system 140, such as warning graphic 670 and textual warning notice 672 between ownship icon 612 and aircraft icon 614, based on a predicted collision zone determined and transmitted by collision awareness system 140.
  • ownship 612 may have a predicted route leading from between surface guidance markers 642 to apron area 656.
  • a first portion of the predicted route may include portions of taxiway 634 and a second portion of the predicted route may include portions of apron area 656.
  • the first portion may correspond to areas of the particular aerodrome location that include aerodrome guidance features, such as surface guidance markers, whereas the second portion may correspond to areas of the particular aerodrome location that do not include aerodrome guidance features, such as apron areas.
  • AMMD 610 may display predicted positions of vehicle 614 and/or vehicle 612 at or near predicted collision zone 670. AMMD 610 may also display predicted routes along with current positions and one or more predicted positions over time. In another example, AMMD 610 may display one or more predicted positions of second vehicle 614, such that a user may view the predicted route and predicted positions contributing to the predicted collision zone of ownship 612 and second vehicle 614. The user may toggle on and off various aspects of displayed information, such as toggling predicted positions on or off.
  • predicted positions and/or predicted routes may be displayed as holograms or otherwise faint depictions of vehicle movement or of a stationary vehicle location, so as to indicate to a user that the position or route is not an actual route, but instead represents a predicted route that is subject to change over time based on predictions from collision awareness system 140.
  • collision awareness system 140 may connect to an aircraft system of a particular aircraft 612 (e.g., the EFB application running on user interface device 104) over the extended range wireless network (via a wireless router, such as wireless router 710 of FIG. 7 ), where the particular aircraft may be among the transient surface objects that collision awareness system 140 is monitoring.
  • collision awareness system 140 may establish a secure wireless communication channel over the extended range wireless network with the EFB application running on user interface device 104, or with another aircraft system on the particular aircraft, and then transmit its information, including the position and velocity information for the one or more transient surface objects, over the secure wireless communication channel.
  • the EFB application executing on user interface device 104 may thus receive all of the information transmitted by collision awareness system 140, and receive all of the benefit of collision awareness system 140, simply with a software upgrade to an EFB application that implements examples of this disclosure.
  • a pilot or flight crew may gain the benefits of this disclosure without requiring any new hardware (since an EFB application of this disclosure may execute on an EFB tablet computer or other EFB that a flight crew already has), and without requiring any hardware or software change to the installed equipment of the aircraft itself, and thus without the need to go through a certification process of any new installed aircraft systems.
  • a pilot, flight crew, or aircraft operator may also enjoy the benefits of an implementation of this disclosure without the need to rely on new hardware or software from an original equipment manufacturer (OEM) of installed hardware or software systems installed in the aircraft.
  • FIG. 7 is a conceptual block diagram depicting an example aerodrome network system with example ground surface sensors that may be used in conjunction with collision awareness system 140.
  • the example aerodrome network system includes collision awareness system 140 connected to a wireless router 710 via communication circuitry 144.
  • Collision awareness system 140 may be communicatively connected to a number of airport ground surface sensors of various types, including a surface movement radar (SMR) transceiver 720, multilateration sensor(s) 722, multilateration reference transmitter 724, and/or to additional types of airport ground surface sensors via multilateration system LAN 770 or redundant airport system ethernet local area networks (LANs) 716A and 716B ("airport system LANs 716").
  • Multilateration sensor(s) 722 may gather data on the movement of surface vehicles 111 and provide the data to collision awareness system 140 via network 130 (e.g., airport system ethernet LAN 716, etc.), such that collision awareness system 140 may use such data to confirm various predictions based on non-sensor data.
  • processing circuitry 142 may receive ground surface sensor data for a first vehicle and/or a second one of vehicles 111. Processing circuitry 142 may receive ground surface sensor data collected as described in various techniques of U.S. Patent Publication No. 2016/0196754, by Lawrence J. Surace, filed Jan. 6, 2015, the entire content of which is hereby incorporated by reference in its entirety.
  • SMR transceiver 720 is connected to at least one SMR antenna 726.
  • SMR transceiver 720 and SMR antenna 726 may be configured to detect, monitor and gather data from various airport ground surfaces and detect transient surface objects on the airport ground surfaces, including aircraft, ground vehicles, and any other moving or impermanent objects on the ground surfaces (or "transient surface objects").
  • processing circuitry 142 may use data from one or more SMR transceivers 720, multilateration sensors 722, or other airport ground surface sensors, and combine the data from these multiple airport ground surface sensors to generate position and velocity information for the one or more transient surface objects on the airport ground surfaces.
  • Processing circuitry 142 may use the position and velocity information for the one or more transient surface objects on the airport ground surfaces to determine predicted positions along predicted paths by extrapolating a position using the current position, velocity information and predicted changes in velocity or position as informed by historical navigation route data.
  • processing circuitry 142 may then determine a current position of the first vehicle and/or the second vehicle from the ground surface sensor data. In some instances, processing circuitry 142 may determine the current position for one vehicle, whereas the other vehicle may be parked and out of range of the ground surface sensors. In any event, processing circuitry 142 may predict a current position of the other vehicle using historical navigation route data. In such examples, processing circuitry 142 may identify both a predicted position of the first vehicle and a predicted position of the second vehicle using the current position of the first and second vehicles 111.
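The extrapolation described above might be sketched with constant-acceleration kinematics, where the acceleration hint stands in for predicted velocity changes informed by historical navigation route data (parameters and units here are illustrative assumptions):

```python
def extrapolate_position(x, y, vx, vy, dt, accel_hint=(0.0, 0.0)):
    """Predict a position dt seconds ahead from a sensor-derived current
    position (x, y) in metres and velocity (vx, vy) in m/s.

    accel_hint: assumed (ax, ay) acceleration in m/s^2 standing in for
    historically informed predicted changes in velocity.
    """
    ax, ay = accel_hint
    return (x + vx * dt + 0.5 * ax * dt ** 2,
            y + vy * dt + 0.5 * ay * dt ** 2)
```

For example, a vehicle at the origin moving at (4, 3) m/s is extrapolated to (40, 30) after ten seconds with no acceleration hint.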
  • Multilateration sensors 722 may be configured to detect, monitor and gather data from various airport ground surfaces and to detect transient surface objects on the airport ground surfaces, in ways that may complement the detection by SMR transceiver 720.
  • Example multilateration sensor data collection techniques are described in U.S. Patent Publication No. 2016/0196754 .
  • multilateration sensors 722 may be implemented as omnidirectional antenna sensors stationed at various remote locations around the airport.
  • collision awareness system 140 may be connected to any one or more sensors of a wide variety of other types of sensors configured to detect transient surface objects on the airport ground surfaces.
  • processing circuitry 142 of collision awareness system 140 may be communicatively connected to and configured to receive data from one or more microwave sensors, optical imaging sensors, ultrasonic sensors, lidar transceivers, infrared sensors, and/or magnetic sensors.
  • collision awareness system 140 may incorporate features and/or components of an airport ground surface monitoring system, such as Advanced Surface Movement Guidance and Control System (A-SMGCS) or Airport Surface Detection Equipment-Model X (ASDE-X) System.
  • collision awareness system 140 may incorporate one or more of the SMR transceiver 720 and SMR antenna 726, multilateration sensors 722, and/or other airport ground surface sensors.
  • collision awareness system 140 may incorporate or integrate signals or sensor input from a combination of surface movement radar, multilateration sensors, and satellites.
  • One or more of the types of airport ground surface sensors may be configured to generate signals indicative of positions of transient ground objects to within a selected accuracy, such as five meters, for example, enabling processing circuitry 142 to generate position and velocity information for the transient ground objects of a similar level of accuracy.
  • Processing circuitry 142 may also be at least at times (at all times or at only certain times) communicatively connected to sensors positioned outside the vicinity of the airport, such as imaging sensors hosted on satellites, airships, or drones with imaging and communication capability.
  • Processing circuitry 142 may be configured to use data from SMR transceiver 720 and/or multilateration sensors 722 and multilateration reference transmitter 724 to evaluate or determine positions and velocities of transient surface objects on the airport ground surfaces, and to generate position and velocity information for one or more transient surface objects on the airport ground surfaces based at least in part on data from the SMR transceiver 720 and/or from the multilateration sensors 722, and/or from one or more other airport ground surface sensors.
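As one way to derive a velocity estimate from two successive, timestamped position fixes fused from SMR and/or multilateration data (a sketch; the fix format is an assumption):

```python
def velocity_from_fixes(fix1, fix2):
    """Estimate ground velocity (m/s) from two timestamped position fixes,
    each given as (t_seconds, x_metres, y_metres)."""
    t1, x1, y1 = fix1
    t2, x2, y2 = fix2
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("fixes must be in increasing time order")
    return ((x2 - x1) / dt, (y2 - y1) / dt)
```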
  • processing circuitry 142 may generate positions and velocities at one or more times of one or more airport ground vehicles or other ground support equipment, such as refueler trucks, pushback tugs, airport busses, container loaders, belt loaders, baggage carts, catering vehicles, emergency vehicles, snow removal vehicles, or ground maintenance equipment, for example.
  • Multilateration sensors 722 may in some examples perform active cooperative interrogation of moving aircraft on the airport ground surfaces. For example, multilateration sensors 722 may transmit interrogation signals via a 1030/1090 megahertz (MHz) Traffic Collision Avoidance System (TCAS) surveillance band.
  • multilateration sensors 722 may include Automatic Dependent Surveillance - Broadcast (ADS-B) transceivers (e.g., Mode S ADS-B transceivers) configured to receive ADS-B messages from aircraft on the airport ground surface.
  • Various aircraft in motion on the airport ground surfaces (at least, aircraft that have their ADS-B systems active while on the ground) may broadcast ADS-B messages.
  • Multilateration sensors 722 using ADS-B may receive the ADS-B messages and communicate the ADS-B messages, potentially with additional data such as time of receipt, to processing circuitry 142, thus facilitating processing circuitry 142 determining and generating position and velocity information for the responding aircraft.
  • processing circuitry 142 of collision awareness system 140 may be configured to output the position and velocity information generated for transient surface objects to communication circuitry 144 and thus to extended range wireless router 710 for transmission over a wireless local area network.
  • processing circuitry 142 of collision awareness system 140 may output position and velocity information for one or more transient ground surface objects at a selected ethernet connection or output port to an IP address, where the selected ethernet connection or output port is connected to communication circuitry 144, for example, via a WNIC.
  • the extended range wireless network established by wireless router 710 may extend its range across the airport, and include all of the taxiways, runways, gate areas, apron areas, hangar bays, and other trafficways in its range.
  • the extended range wireless network provided by wireless router 710 may thus include all of the aircraft on the airport ground surfaces within range, and may potentially provide wireless connectivity in the cockpits of all of the aircraft, including to EFBs of the pilots or flight crew of the various aircraft.
  • extended range wireless router 710 may be incorporated together with collision awareness system 140 in a single unit or component.
  • processing circuitry 142 is configured to receive data from one or more airport ground surface sensors (e.g., one or both of SMR transceiver 720 and multilateration sensors 722) configured to detect transient surface objects on an airport ground surface.
  • processing circuitry 142 of collision awareness system 140 may be further configured to generate position and velocity information for one or more transient surface objects on the airport ground surface based at least in part on the data from the one or more airport ground surface sensors.
  • Communication circuitry 144 of collision awareness system 140 may be configured to receive the position and velocity information for the one or more transient surface objects from processing circuitry 142 and to output the position and velocity information for the one or more transient surface objects to wireless router 710 for transmission over a wireless local area network.
  • Any of a wide variety of processing devices, such as collision awareness system 140, other components that interface with collision awareness system 140 and/or implement one or more techniques of collision awareness system 140, or other central processing units, ASICs, graphical processing units, computing devices, or processing devices of any other type, may perform process 300 or portions or aspects thereof.
  • Collision awareness system 140 and/or other components that interface with collision awareness system 140 and/or implement one or more techniques of collision awareness system 140 as disclosed above may be implemented in any of a variety of types of circuit elements.
  • processors of collision awareness system 140 or other components that interface with collision awareness system 140 and/or implement one or more techniques of collision awareness system 140 may be implemented as one or more ASICs, as a magnetic nonvolatile random-access memory (RAM) or other types of memory, a mixed-signal integrated circuit, a central processing unit (CPU), a field programmable gate array (FPGA), a microcontroller, a programmable logic controller (PLC), a system on a chip (SoC), a subsection of any of the above, an interconnected or distributed combination of any of the above, or any other type of component or one or more components capable of being configured to predict collision zones at a prospective time using guidance features of an aerodrome, historical data, and/or clearance information, and perform other functions in accordance with any of the examples disclosed herein.
  • Functionality described herein may be implemented by processors including one or more microprocessors, DSPs, ASICs, FPGAs, or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components, embodied in electronics included in components of system 140 or other systems described herein.
  • Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various operations and functions described in this disclosure.
  • any of the described units, modules or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units must be realized by separate hardware or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware or software components, or integrated within common or separate hardware or software components. For example, while collision awareness system 140 is shown as a separate system in FIG. 1, collision awareness system 140 may execute on one or more of data server(s) 132, vehicles 111, traffic controller 114, user interface devices 104, data stores 105, or any combination thereof. In one example, collision awareness system 140 may be implemented across multiple devices, such as data server(s) 132 and vehicles 111 simultaneously.
  • functionality ascribed to the devices and systems described herein may be embodied as instructions on a computer-readable medium such as random access memory (RAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, magnetic data storage media, optical data storage media, or the like.
  • the instructions may be executed to support one or more aspects of the functionality described in this disclosure.
  • the computer-readable medium may be non-transitory.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)

Abstract

This disclosure is directed to methods, computer program products, and systems for providing surface vehicle tracking data, including indications of potential collision zones, to an airport map display system onboard an aircraft. In one example, a method includes identifying historical navigation route data, aerodrome guidance features, and a predicted path of a first vehicle. The method further includes determining predicted positions along the predicted path and determining predicted positions of a second vehicle and comparing vehicle envelopes for the two vehicles to determine a predicted collision zone of the vehicles.

Description

  • This application claims priority to Indian Provisional Application 202011006508, filed February 14, 2020, the entire content of which is hereby incorporated by reference.
  • This disclosure relates to collision awareness for vehicles.
  • BACKGROUND
  • As commercial aviation traffic has continued to increase over the years, airports have grown increasingly busy. As such, collision avoidance systems have been implemented that help prevent potential collisions of an aircraft with another aircraft using various sensors, imaging devices, radar, and other hardware components installed on an aircraft. Such hardware components increase the weight, maintenance complexity, and in general, overall cost, for such vehicles.
  • An increasing amount of air traffic has also involved very large airliners with very long wingspans, which may sometimes reduce wingtip clearance margins while the aircraft is in motion on airport ground surfaces. In addition, multiple aircraft in an area may be powered down at any given time or otherwise not emitting a tracking beacon that may be used to reduce the likelihood of collision with another vehicle. An aircraft may be powered down and pulled under tug by a tug vehicle, in which case aircraft ground collisions or collisions between aircraft and other vehicles may be even more likely to occur. Furthermore, in areas of an aerodrome where aircraft or aircraft tug vehicles navigate unmarked routes, such as in an airport apron area or hangar bay, wingtip collisions may occur at an even higher rate due to the seemingly unrestricted route an aircraft or tug vehicle can take to reach an intended destination.
  • SUMMARY
  • This disclosure is directed to methods, systems, and computer program products for predicting potential collision zones of vehicles at prospective times using historical vehicle data, clearance information for one or more vehicles, and/or aerodrome guidance features. In some examples, a vehicle may transmit a current location of the vehicle to a user interface (UI) device (e.g., an electronic flight bag (EFB)) or to a remote data server (e.g., a cloud-based data server). The remote data server or EFB may predict the potential collision zone using one or more of the historical navigation route data, clearance information for one or more vehicles, and/or aerodrome guidance features and provide an indication of the potential collision zone to a user. The historical navigation route data may be based on transponder positional data and stored in a database of historical vehicle data. In addition, the aerodrome guidance features may include data stored in a database that provides information as to the location of guidance markings, such as guidance lines painted on a surface, guidance signs, building features, and other information that provide guidance to vehicles throughout a particular aerodrome location. A collision awareness system may predict routes using the historical navigation route data and aerodrome guidance features and predict vehicle positions along the route to determine potential collision zones. In some instances, the collision awareness system may provide the potential collision zone data for display on an EFB, such as on an airport moving map display (AMMD) application executing on the EFB.
  • In some examples, surface vehicle tracking systems may be used to determine airport surface transient object data using, for example, multilateration sensors or other airport system sensors. This data may be used to confirm or verify the collision awareness system's predicted potential collision zones using one or more of the historical navigation route data, clearance information for one or more vehicles, and/or aerodrome guidance features. Some transient ground objects or types of transient ground objects may not be actively transmitting messages or signals that may be received by certain types of multilateration sensors or other airport system sensors, or may not respond to certain types of interrogation signals transmitted by multilateration sensors or other airport system sensors, such as if the airport system sensors use cooperative surveillance with which objects other than aircraft are not typically configured to cooperate. In some examples, a transient aircraft may be pulled via an aircraft tug (e.g., a tug vehicle that transports other vehicles). In such instances, the aircraft being pulled may be powered down at the time, such that the aircraft does not transmit signals that may be used to track the vehicle location. In addition, the vehicles may be in areas of an aerodrome that provide less guidance to vehicles via aerodrome guidance features. For example, an apron area of an airport may not include painted guidance features on the surface that may be referenced in an aerodrome guidance database. As such, complex maneuvering and high traffic areas in various aerodrome locations increase the likelihood of potential vehicle collision (e.g., wingtip collisions, etc.).
  • In accordance with various techniques of this disclosure, a collision awareness system may utilize one or more of the historical navigation route data and/or aerodrome guidance features to predict potential collision zones between vehicles. In addition, the collision awareness system may utilize vehicle clearance information, such as clearance information from an air traffic controller (ATC), to predict potential collision zones between vehicles traversing a surface, where at least one vehicle is moving, either by tug or not. In some examples, the collision awareness system may execute on a remote server that collects data, such as positions of vehicles, updates databases, and predict collision zones. In another example, the collision awareness system may execute at least in part on an EFB or other user interface device.
  • In some examples, the collision awareness system may receive clearance information in the form of text or voice, process the clearance information, and determine navigational information for a vehicle or predict, based on the clearance information, a current position of a vehicle. For example, if a vehicle receives clearance information to a particular gate of an apron area, but then powers down the avionics system of the vehicle, the collision awareness system may determine how much time has passed since the vehicle received the clearance information, determine how much time a vehicle has historically taken to arrive at a destination point or another target mark on a path toward the destination point, and predict a location of the vehicle at any particular point in time. In any event, the collision awareness system may use the historical navigation route data and the aerodrome guidance features to predict a location of the vehicle, predict a trajectory of the vehicle, and predict trajectories of other vehicles to determine whether an overlap between envelopes of two or more vehicles indicates a potential collision at a prospective or future time.
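The elapsed-time prediction described above can be sketched as follows. The waypoint structure, the use of historical median traversal times, and the function names are illustrative assumptions for this sketch; the disclosure does not prescribe a particular data layout or interpolation scheme.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    """One point on a cleared route toward a destination (e.g., a gate)."""
    lat: float
    lon: float
    median_elapsed_s: float  # historical median seconds from route start to here

def predict_position(route, seconds_since_clearance):
    """Interpolate a likely (lat, lon) for a vehicle that is no longer
    transmitting, from the time elapsed since it received clearance and
    historical traversal times along the cleared route."""
    if seconds_since_clearance <= route[0].median_elapsed_s:
        return (route[0].lat, route[0].lon)
    for prev, nxt in zip(route, route[1:]):
        if seconds_since_clearance <= nxt.median_elapsed_s:
            frac = ((seconds_since_clearance - prev.median_elapsed_s)
                    / (nxt.median_elapsed_s - prev.median_elapsed_s))
            return (prev.lat + frac * (nxt.lat - prev.lat),
                    prev.lon + frac * (nxt.lon - prev.lon))
    # Beyond the last historical mark: assume the vehicle reached the destination.
    return (route[-1].lat, route[-1].lon)
```

In practice the historical medians would be derived from the historical navigation route data for comparable vehicles at the same aerodrome, and the interpolated position would feed the envelope comparison described below.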
  • In this way, a collision awareness system may be implemented without requiring additional hardware installation on an aircraft and may provide reliable indications as to predicted collision zones in an aerodrome by leveraging particular computing systems to overlay data, such as aerodrome surface data overlaid with historical navigation route data. In addition, the collision awareness system may leverage machine learning models, trained on particular data inputs, to provide such predictions and allow continuous modeling and updating of predicted routes as a vehicle traverses the route. For example, a collision awareness system may predict a route of a first vehicle, but as the first vehicle starts traveling the predicted route, may determine an updated predicted route of the first vehicle, such as based on data received from the vehicle (e.g., speed information, position information, etc.), thereby allowing the collision awareness system to provide dynamic predictions on the fly as objects are moving throughout the aerodrome and as historical navigation route data evolves with ever-changing conditions. Furthermore, the collision awareness system may predict collision zones based on aircraft specifics and aerodrome specifics while referencing both general and specific information derived from multiple vehicle types and aerodrome locations.
  • In one example, a method includes obtaining, by processing circuitry of a ground collision awareness system, historical navigation route data for one or more reference vehicles, the historical navigation route data being based on transponder positional data. The method further includes identifying, by the processing circuitry, a plurality of aerodrome guidance features for a particular aerodrome location, the aerodrome guidance features including guidance marker information. The method further includes determining, by the processing circuitry, a predicted path of a first vehicle, the predicted path comprising a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information, and the second portion of the predicted path being predicted using the historical navigation route data. The method further includes determining, by the processing circuitry, a predicted position of the first vehicle along the predicted path at a prospective time. The method further includes determining, by the processing circuitry, a predicted position of a second vehicle with respect to approximately the same prospective time. The method further includes performing, by the processing circuitry, a comparison of a first vehicle envelope for the first vehicle and a second vehicle envelope for the second vehicle at the predicted positions. The method further includes identifying, by the processing circuitry, an overlap of the first vehicle envelope and the second vehicle envelope. The method further includes determining, by the processing circuitry, a predicted collision zone of the first vehicle and the second vehicle at the prospective time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
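The envelope comparison and overlap steps of the method above can be sketched as follows. The disclosure does not fix an envelope geometry, so circular safe-zone envelopes in a local planar coordinate frame (e.g., meters on a tangent plane) are a simplifying assumption here, as are the function names.

```python
import math

def envelopes_overlap(center_a, radius_a, center_b, radius_b):
    """Two circular vehicle envelopes overlap when the distance between
    their centers is less than the sum of their radii."""
    return math.dist(center_a, center_b) < radius_a + radius_b

def predicted_collision_zone(pos_a, radius_a, pos_b, radius_b):
    """Compare the envelopes at the predicted positions for the same
    prospective time; return the midpoint between the vehicles as a crude
    collision-zone marker, or None when the envelopes are clear."""
    if not envelopes_overlap(pos_a, radius_a, pos_b, radius_b):
        return None
    return ((pos_a[0] + pos_b[0]) / 2, (pos_a[1] + pos_b[1]) / 2)
```

The predicted positions would come from evaluating each vehicle's predicted path at approximately the same prospective time; a production system would likely use envelopes shaped to the airframe (wingspan, tail height) rather than circles.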
  • In another example, a ground collision awareness system comprising a processor and a memory is disclosed. The memory is configured to store: historical navigation route data for one or more reference vehicles, wherein the historical navigation route data are based on transponder positional data, and a plurality of aerodrome guidance features for one or more aerodrome locations, wherein the aerodrome guidance features include guidance marker information. The processor of the ground collision awareness system is configured to determine a predicted path of a first vehicle, the predicted path comprising a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information, and the second portion of the predicted path being predicted using the historical navigation route data; determine a predicted position of the first vehicle along the predicted path at a prospective time; determine a predicted position of a second vehicle with respect to approximately the same prospective time; perform a comparison of a first vehicle envelope for the first vehicle and a second vehicle envelope for the second vehicle at the predicted positions; identify an overlap of the first vehicle envelope and the second vehicle envelope; and determine a predicted collision zone of the first vehicle and the second vehicle at the prospective time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
  • In another example, a non-transitory computer-readable storage medium having stored thereon instructions is disclosed. The instructions, when executed, cause one or more processors to: obtain historical navigation route data for one or more reference vehicles, the historical navigation route data being based on transponder positional data; identify a plurality of aerodrome guidance features for a particular aerodrome location, the aerodrome guidance features including guidance marker information; determine a predicted path of a first vehicle, the predicted path comprising a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information, and the second portion of the predicted path being predicted using the historical navigation route data; determine a predicted position of the first vehicle along the predicted path at a prospective time; determine a predicted position of a second vehicle with respect to approximately the same prospective time; perform a comparison of a first vehicle envelope for the first vehicle and a second vehicle envelope for the second vehicle at the predicted positions; identify an overlap of the first vehicle envelope and the second vehicle envelope; and determine a predicted collision zone of the first vehicle and the second vehicle at the prospective time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
  • The disclosure is also directed to an article of manufacture comprising a computer-readable storage medium. The computer-readable storage medium comprises computer-readable instructions that are executable by a processor. The instructions cause the processor to perform any part of the techniques described herein. The instructions may be, for example, software instructions, such as those used to define a software or computer program. The computer-readable medium may be a computer-readable storage medium such as a storage device (e.g., a disk drive, or an optical drive), memory (e.g., a Flash memory, read only memory (ROM), or random access memory (RAM)) or any other type of volatile or non-volatile memory or storage element that stores instructions (e.g., in the form of a computer program or other executable) to cause a processor to perform the techniques described herein. The computer-readable medium may be a non-transitory storage medium.
  • The details of one or more examples are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
  • BRIEF DESCRIPTION OF DRAWINGS
    • FIG. 1 is a conceptual block diagram depicting an example collision awareness system interfacing with various components to determine predicted collision zones of vehicles, in accordance with aspects of this disclosure.
    • FIG. 2 is a conceptual block diagram for an example computing system with an example computer-executable collision awareness system, in accordance with aspects of this disclosure.
    • FIG. 3 depicts a flowchart for an example process that a collision awareness system may implement for identifying predicted collision zones, using one or more of the historical navigation route data and/or aerodrome guidance features, in accordance with aspects of this disclosure.
    • FIG. 4 depicts an example technique that a collision awareness system may implement for aligning historical navigation routes with aerodrome guidance features, in accordance with aspects of this disclosure.
    • FIGS. 5A-5C depict a conceptual diagram of a portion of an airport with various aircraft and ground vehicles on airport runways, taxiways, and other airport ground surfaces, and an example collision awareness system predicting the various aircraft and ground vehicle positions on the airport ground surfaces, in accordance with aspects of this disclosure.
    • FIG. 6 is a diagram of an example graphical output display of an airport ground surface map that may be implemented by a two-dimensional airport moving map display (2D AMMD) application that may be implemented on an electronic flight bag (EFB), e.g., on a tablet computer of a flight crew member in the cockpit of a particular aircraft on a ground surface of the airport, in accordance with aspects of this disclosure.
    • FIG. 7 is a conceptual block diagram depicting an example aerodrome network system with example ground surface sensors, in accordance with aspects of this disclosure.
    DETAILED DESCRIPTION
  • Various examples are described below generally directed to methods, computer program products, and electronic systems that may provide collision awareness data, including indications of potential collision zones, from a collision awareness system. The collision awareness system may provide such data over a wireless network to an application (e.g., an airport moving map display (AMMD) application) onboard an aircraft. For example, the collision awareness system may provide such indications of potential collision zones on an electronic flight bag (EFB), which may be implemented on a tablet computer or analogous user interface device. The flight crew may view and use the AMMD enhanced with the information from the collision awareness system while the pilot controls the aircraft on the airport ground surface, for example, during taxiing, parking, etc. In some examples, a tug operator may view and use the AMMD enhanced with the information from the collision awareness system while tugging an aircraft to a destination location according to ATC clearance information. The collision awareness system may determine potential collision zones with one or more other surface vehicles (e.g., other aircraft or ground vehicles) and transmit warnings of the potential collision zones to the EFB. Implementations of this disclosure may thus provide better situational awareness for controlling ground movement of aircraft on airport taxiways, including in weather conditions of limited visibility, without the need for any new hardware to be installed in the aircraft itself (and thus, without the need for new hardware to be certified by relevant aviation authorities), and without requiring cooperative participation by other aircraft. 
Implementations of this disclosure may not only decrease the possibility of collision of an aircraft with another aircraft or surface vehicle, but may also provide additional benefits for the airport, such as smoother taxiing and fewer interruptions or delays due to confusion or lack of situational awareness in the ground traffic.
  • FIG. 1 is a conceptual block diagram depicting example components of a collision awareness system environment 102. A collision awareness system may operate in such an example collision awareness system environment 102, including various example components of FIG. 1. In the illustrated example, the collision awareness system environment 102 includes various components, including surface and/or flight vehicles 111, a traffic controller 114, one or more data server(s) 132, various databases or datastores 105, and user interface devices 104. As such, a collision awareness system may be implemented as software installed on one or more of the components of collision awareness system environment 102.
  • While vehicles 112A-N may be referred to at times as being airplanes of various configurations, the techniques of this disclosure are not so limited, and vehicles 112A-N may include other vehicles, such as helicopters, hybrid tilt-rotor aircraft, urban air vehicles, jets, quadcopters, hovercraft, space shuttles, uncrewed aerial vehicles (UAVs), flying robots, etc. In addition, while vehicles 113A-N may be referred to at times as being tug vehicles, the techniques of this disclosure are not so limited, and vehicles 113A-N may include other vehicles, such as unmanned ground vehicles, transient ground surface vehicles, unmanned tug vehicles (e.g., remote control vehicles), luggage cart vehicles having multiple cars attached via linkages, refueler trucks, airport buses, container loaders, belt loaders, catering vehicles, emergency vehicles, snow removal vehicles, ground maintenance equipment, etc. In some examples, vehicles 111 may receive direct communications from traffic controller 114, such as via radio or cellular communication. For example, traffic controller 114 may transmit clearance information directly to one of aircraft 112 or to a tug vehicle 113, indicating a destination port for parking an aircraft.
  • In some examples, user interface devices 104A-104N may include a wide variety of user interface devices. For example, user interface devices 104 may include tablet computers, laptop computers, phones, EFBs, augmented reality headsets or virtual reality headsets, or other types of user interface devices. User interface devices 104 may be configured to receive surface vehicle movement data with indications of potential collision zones from a collision awareness system. User interface device 104 may also be configured to generate (e.g., render) and present an AMMD that shows transient surface vehicles and indications of potential collision zones, in accordance with illustrative aspects of this disclosure, such as those of FIGS. 5A-5C.
  • Network 130 may include any number of different types of network connections, including satellite connections and Wi-Fi™ connections. For example, network 130 may include networks established using geosynchronous satellites 105A, low-earth orbit satellites 105B, global navigation satellite systems 105C, cellular base station transceivers 160 (e.g., for 3G, 4G, LTE, and/or 5G cellular network access), and/or Wi-Fi™ access points. In turn, the geosynchronous satellites 105A and low-earth orbit satellites 105B can communicate with gateways that provide access to network 130 for one or more devices implementing the collision awareness system. Cellular base station transceivers can have connections that provide access to network 130. In addition, global navigation satellite systems can communicate directly with vehicles 111, for example to triangulate (or otherwise compute) a current location of vehicles 111. These various satellite, cellular, and Wi-Fi network connections can be managed by different third-party entities, referred to herein as "carriers." In some examples, network 130 may include a wired system. For example, network 130 may include an ethernet system, such as a redundant ethernet system shown in FIG. 7 of this disclosure. In some examples, network 130 may include a multilateration system local area network (LAN), such as the multilateration system LAN shown in FIG. 7 of this disclosure.
  • In some examples, any one of the devices of collision awareness system environment 102 executing one or more techniques of a collision awareness system may be configured to communicate with any one of the various components via network 130. In other instances, a single component of collision awareness system environment 102 may be configured to execute all techniques of the collision awareness system. For example, the collision awareness system may include a system that resides on vehicles 111, data server(s) 132, traffic controller 114, or user interface devices 104A-104N. In some examples, the collision awareness system may operate as part of a software package installed on one or more computing devices. For example, traffic controller 114 may operate software that executes one or more of the various techniques of the disclosed collision awareness system, such as a software version of the collision awareness system installed on a computing device of traffic controller 114. Likewise, the disclosed collision awareness system may be included with user interface devices 104 or one or more data server(s) 132. For example, data server(s) 132 may include a cloud-based data server that implements the disclosed collision awareness system.
  • In some examples, one or more data server(s) 132 may be configured to receive input data from network 130 (e.g., vehicle positional data, aerodrome guidance features, clearance information, etc.), determine a predicted collision zone, in accordance with one or more techniques of this disclosure, and may output predicted collision zone data to one or more components of FIG. 1, such as user interface devices 104, traffic controller 114, or vehicles 111. In some examples, data server(s) 132 may include datastores 105. In other examples, some or all of datastores 105 may be embodied as separate devices that interface with other components of collision awareness system environment 102 directly or via network 130. For example, where collision awareness system is implemented at least in part on one or more of data server(s) 132, datastores 105 may interface with data server(s) 132 directly or via network 130.
  • In some examples, the databases may include historical vehicle data 106, aerodrome guidance data 108, and in some instances, clearance data 110. Although shown as being a single datastore 105, the databases shown as part of datastore 105 may be embodied as separate objects. In some examples, a database included with a vehicle or external to the vehicle may be or include a key-value data store, such as an object-based database or dictionary. In a non-limiting example, a database may include any data structure (and/or combinations of multiple data structures) for storing and/or organizing data, including, but not limited to, relational databases (e.g., Oracle databases, MySQL databases, etc.), non-relational databases (e.g., NoSQL databases, etc.), in-memory databases, spreadsheets, comma-separated values (CSV) files, extensible markup language (XML) files, TeXT (TXT) files, flat files, spreadsheet files, and/or any other widely used or proprietary format for data storage. For example, the historical navigation route data may be laid out as individual, structured XML segments.
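One such XML layout can be sketched as follows. The disclosure states only that the historical navigation route data may be laid out as individual, structured XML segments; the element names, attribute names, and identifier values below are illustrative assumptions, not a schema from the disclosure.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML segment for one short run of historical route samples.
segment = """\
<routeSegment vehicleId="N12345" airport="KORD">
  <sample speed="10" lat="41.97267" lon="-87.89229" epoch="1496021663"/>
  <sample speed="11" lat="41.97268" lon="-87.89236" epoch="1496021664"/>
</routeSegment>"""

root = ET.fromstring(segment)
# Flatten the samples into (speed, lat, lon, epoch) tuples for downstream use.
samples = [
    (float(s.get("speed")), float(s.get("lat")),
     float(s.get("lon")), int(s.get("epoch")))
    for s in root.iter("sample")
]
```

Keeping one segment per vehicle per route makes it straightforward to key the data by vehicle ID and airport identifier, consistent with the additional data entries discussed below.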
  • Databases are typically stored in one or more data stores. Accordingly, each database referred to herein (e.g., in the description herein and/or the figures of the present application) is to be understood as being stored in one or more data stores. In various examples, outgoing requests and/or incoming responses may be communicated in any suitable formats. For example, XML, JSON, and/or any other suitable formats may be used for API requests and responses or otherwise. As described herein, data transfer refers to both transmitting data from one of vehicles 111, traffic controller 114, data server(s) 132, or user interface devices 104 over network 130 and receiving data at user interface devices 104, data server(s) 132, traffic controller 114, or vehicles 111, over network 130. The data transfer may follow various formats, such as a database format, files, XML, HTML, RDF, JSON, a file format that is proprietary to the system, data object format, or any other format, and may be encrypted or have data of any available type.
  • In some examples, historical vehicle data 106 may store historical navigation route data, vehicle data (e.g., maintenance logs, safe zone envelope data, etc.). An example visual depiction of certain historical navigation route data may be as shown in Table 1 below. Table 1
    Speed Latitude Longitude Epoch time Real time
    10 41.97267 -87.89229 1496021663 1:34:23
    11 41.97268 -87.89236 1496021664 1:34:24
    10 41.97266 -87.8924 1496021665 1:34:25
    9 41.97268 -87.89246 1496021666 1:34:26
    13 41.9727 -87.89256 1496021667 1:34:27
    8 41.97268 -87.89257 1496021668 1:34:28
    10 41.97268 -87.89264 1496021669 1:34:29
    9 41.97266 -87.89269 1496021670 1:34:30
    11 41.97267 -87.89278 1496021671 1:34:31
    11 41.97265 -87.89284 1496021672 1:34:32
    The historical navigation route data for one data set may correspond to data obtained for a particular one of vehicles 111. For example, the above Table 1 may include data with respect to a particular one of vehicles 111, such as vehicle 112A or vehicle 113A. In addition, the historical navigation route data may be with respect to a particular location, such as a particular aerodrome location. As such, historical navigation route data may include additional data entries for 'vehicle IDs' and 'airport identifiers.' In any event, the above table is merely one example representation of certain historical navigation route data that datastore 105 may manage and store over time. The navigation route data may be based on data received directly from each aircraft, such as from transponder data, or may include tracking data obtained otherwise, such as through external sensors. In some examples, data entries related to 'speed' as shown may relate to ground speed. Historical vehicle data 106 may store speed in any suitable unit, such as nautical miles per hour, meters per second, etc. Historical vehicle data 106 may also store acceleration data as determined from the speed data or as received directly from one of vehicles 111 or external sensors.
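Because Table 1 records speed at one-second intervals, acceleration can be derived from adjacent rows as the change in speed over the change in epoch time. A minimal sketch (the function name is illustrative, and units follow whatever unit the speed column uses):

```python
def accelerations(samples):
    """Approximate acceleration between consecutive (speed, epoch) samples
    as delta-speed divided by delta-time."""
    return [
        (s1[0] - s0[0]) / (s1[1] - s0[1])
        for s0, s1 in zip(samples, samples[1:])
    ]

# First four rows of Table 1, reduced to (ground speed, epoch time).
rows = [(10, 1496021663), (11, 1496021664), (10, 1496021665), (9, 1496021666)]
```

Here `accelerations(rows)` yields one value per adjacent pair of rows, which historical vehicle data 106 could store alongside the speed samples.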
  • In some examples, aerodrome guidance data 108 may include maps or other data representations of a ground surface of an airport, including guidance features configured to guide a vehicle through a particular aerodrome location. The ground surface can be, for example, a taxiway, runway, gate area, apron, hangar bays, or other trafficway or ground surface of an airport. For purposes of this disclosure, description of an "airport" may apply equivalently to an airbase, an airfield, or any other type of permanent or temporary aerodrome. In some examples, aerodrome guidance data 108 may include multiple databases specific to a particular aerodrome or multiple aerodromes within a certain vicinity. For example, aerodrome guidance data 108 may include a fixed ground object database specific to the airport, including real-time or recent-time imaging or detection, or a combination of the two, to provide fixed ground object information and provide aerodrome guidance features, such as guidance line coordinates fixed to the surface of an aerodrome. In any event, data server(s) 132 or other components of collision awareness system environment 102 may be configured to access one or more of aerodrome guidance data 108. For example, data server(s) 132 may identify a particular aerodrome location, such as a particular airport in a particular city, and access aerodrome guidance data 108 specific to the identified aerodrome location. In some examples, aerodrome guidance data 108 for multiple aerodromes may be included in a single datastore 105, rather than in separate datastores 105 as may be the case in some examples.
  • In some examples, datastore 105 may further include clearance data 110. In some examples, clearance data 110 may be included as a separate datastore 105. For example, clearance data 110 may reside with a datastore stored on a computing system of traffic controller 114. For example, traffic controller 114 of a particular aerodrome may include clearance data 110. Traffic controller 114 may further include other data included with datastores 105. Clearance data 110 may include text or audible clearance information generated by traffic controller 114 and/or vehicle 111, as with communications between traffic controller 114 and a receiving vehicle 111.
  • In some examples, traffic controller 114 may transmit taxiway or runway clearance information to one of vehicles 111 in either text format or voice message. In some examples, one of vehicles 111 may retrieve the taxiway or runway clearance information from traffic controller 114. For example, one of vehicles 111 may perform a database query for clearance data 110 or otherwise request clearance data 110 from traffic controller 114 or a datastore 105 storing clearance data 110. In some examples, the text or voice message may be directly transmitted to one of vehicles 111 from traffic controller 114 (e.g., live communication or from a clearance database 110). One of vehicles 111 may then transmit the clearance information to one or more external systems (e.g., cloud systems) via an Aircraft Data Gateway Communication Unit (ADG). For example, vehicle 111 or traffic controller 114 may transmit clearance information to a device executing the collision awareness system.
  • In various scenarios, traffic controller 114 may send a duplicate copy of clearance message (text or voice message) to data server(s) 132 (e.g., a cloud system) via a secured communication protocol. Once the data is available on data server(s) 132, data server(s) 132 may convert any voice related taxiway or runway clearance information to text information and store the clearance information to a predefined location of datastore 105.
  • A collision awareness system implemented on one or more components of collision awareness system environment 102 may utilize data from datastores 105 to determine predicted collision zones of vehicles. In this way, the collision awareness system may help mitigate or reduce collisions between vehicles 111 (involving body, wingtip, or other portion of vehicles 111) and other aircraft, ground vehicles, or other transient or moving objects on an airport ground surface (collectively, "transient surface objects") while aircraft 112 is taxiing, taking off, landing, or stationary, on the airport ground surface. For purposes of this disclosure, "transient surface objects" may refer to any aircraft, ground vehicles, or other objects on airport ground surfaces, including objects that are permanently fixed in place, and that a collision awareness system may monitor.
  • FIG. 2 is a conceptual block diagram for an example computing system 138 with an example computer-executable collision awareness system 140. As described with reference to FIG. 1, collision awareness system 140 may be embodied in any number of different devices, such as one or more of the components of collision awareness system environment 102 described with reference to FIG. 1. For the sake of simplicity, computing system 138 implementing collision awareness system 140 may be described as executing various techniques of this disclosure across one or more data server(s) 132, such as executing on a cloud server. It will be understood, however, that computing system 138 may be implemented on traffic controller 114, user interface device(s) 104, vehicles 111, or other network devices designed to provide vehicle collision awareness. That is, collision awareness system 140 may execute on any one or more of processing circuitry 142 of computing devices corresponding to a traffic controller 114, user interface device(s) 104, vehicles 111, or other network devices, and combinations thereof. In addition, collision awareness system 140 may execute based on data from storage device(s) 146 included with any one or more of processing circuitry 142 of computing devices corresponding to a traffic controller 114, user interface device(s) 104, vehicles 111, or other network devices, and/or data stores 105, in cases where one or more of databases 106, 108, or 110 are implemented as storage devices separate from storage device(s) 146. In some examples, storage device(s) 146 may include one or more of databases 106, 108, or 110.
  • As such, computing system 138 may implement collision awareness system 140 via processing circuitry 142, communication circuitry 144, and/or storage device(s) 146. In some examples, computing system 138 may include display device 150. For example, where computing system 138 is embodied in one of user interface device 104, vehicles 111, or traffic controller 114, computing system 138 may include a display device 150 that is integral to the particular device. In some examples, display device 150 may include any display device, such as a liquid crystal display (LCD) or a light emitting diode (LED) display or other type of screen, with which processing circuitry 142 may present information related to predicted collision zones. In some examples, display device 150 may not be included with computing system 138. For example, computing system 138 may be one of data server(s) 132 configured to perform various techniques of this disclosure and transmit to another device, such as one of user interface devices 104, collision zone data for display.
  • In examples including display device 150, display device 150 may configure collision zone information to be graphically rendered on a ground navigation application implemented by the aircraft system. In addition, display device 150 may configure position/velocity information for one or more transient surface objects to be graphically rendered on a ground navigation application implemented by the aircraft system. For example, the display device may generate graphical display format data, based on the position and velocity information, that is compatible with the graphical outputs of an AMMD application, such that an AMMD application may overlay, superimpose, or otherwise integrate the graphical display format data with existing AMMD graphical display outputs. Display device 150 generates outputs including or in the form of the graphical display format data, such that the outputs may be readily configured to be received and graphically rendered by an AMMD application executing on an EFB (e.g., on a tablet computer) in the cockpit of an aircraft in motion on the airport ground surfaces, as further described below. In this way, collision awareness system 140 may provide outputs, including alerts or warnings, that may be immediately available, via display device 150, to inform pilots or other flight crew of a potential hazard of an impending collision, such that the pilot or flight crew can take appropriate action.
  • In some examples, display device 150, including one or more display processors, may be incorporated in a single processor, electronic system and/or device, or software system with an integrated implementation of collision awareness system 140, in an integrated collision avoidance logic and display processing subsystem. For example, user interface device 104 may include collision awareness system 140 and display device 150 as a single device, such as an EFB.
  • In some examples, processing circuitry 142 may include fixed function circuitry and/or programmable processing circuitry. Processing circuitry 142 may include any one or more of a microprocessor, a controller, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or equivalent discrete or analog logic circuitry. In some examples, processing circuitry 142 may include multiple components, such as any combination of one or more microprocessors, one or more controllers, one or more DSPs, one or more ASICs, or one or more FPGAs, as well as other discrete or integrated logic circuitry. The functions attributed to processing circuitry 142 herein may be embodied as software, firmware, hardware or any combination thereof.
  • In some examples, communication circuitry 144 may include a wireless network interface card (WNIC) or other type of communication module. In some examples, communication circuitry 144 may have an Internet Protocol (IP) port coupled to an Ethernet connection or to an output port, such that communication circuitry 144 receives outputs from processing circuitry 142. Communication circuitry 144 may be configured to connect to a Wi-Fi™ or other wireless network connection. In some examples, communication circuitry 144 may be separate from collision awareness system 140. For example, collision awareness system 140 may include processing circuitry 142 of computing system 138, whereas communication circuitry 144 may be included in a separate computing system.
  • In some examples, collision awareness system 140 may include one or more storage device(s) 146. In some examples, storage device(s) 146 may include one or more of datastores 105 and may be similarly configured to store data. For example, storage device(s) 146 may include any data structure (and/or combinations of multiple data structures) for storing and/or organizing data, including, but not limited to, relational databases (e.g., Oracle databases, MySQL databases, etc.), non-relational databases (e.g., NoSQL databases, etc.), in-memory databases, comma separated values (CSV) files, extensible markup language (XML) files, text (TXT) files, flat files, spreadsheet files, and/or any other widely used or proprietary format for data storage.
  • In some examples, storage device(s) 146 may include executable instructions that when executed cause processing circuitry 142 to execute various techniques of this disclosure. In addition, storage device(s) 146 may include machine learning (ML) model(s) 148. In some examples, ML model(s) 148 may be included on a separate storage device. For example, ML model(s) 148 may be stored in datastore 105 on data server(s) 132. In such examples, processing circuitry 142 may execute ML model(s) 148 via network connection 130.
  • In some examples, a trained ML model can be used to process and predict paths, vehicle positions, or collision zones in accordance with certain examples of this disclosure where ML models are considered advantageous (e.g., predictive modeling, inference detection, contextual matching, natural language processing, etc.). Examples of ML models that may be used with aspects of this disclosure include classifiers and non-classification ML models, artificial neural networks ("NNs"), linear regression models, logistic regression models, decision trees, support vector machines ("SVM"), Naive or non-Naive Bayes networks, k-nearest neighbors ("KNN") models, k-means models, clustering models, random forest models, or any combination thereof. These models may be trained based on data stored in datastores 105. For example, certain aspects of the disclosure will be described using predicted paths generated from a ML model trained on data from datastores 105, for purposes of illustration only.
  • A ML system or pattern recognition system may be accessed or incorporated by collision awareness system 140. ML model(s) 148 may incorporate knowledge of the predictable future motions of aircraft, ground vehicles, or other objects based on statistical training of one or more ML model(s) 148 or pattern recognition system based on large training data sets of past motions of aircraft, ground vehicles, and other objects as statistically sampled over a specific airport or a representative collection of airports over time. Such a ML system or pattern recognition system may also incorporate statistical training of observed motions of aircraft, ground vehicles, and other transient objects on airport ground surfaces as correlated with a variety of conditions such as traffic levels, weather and visibility conditions, and time of day, for example. One or more initially trained ML model(s) 148 may be further refined with a large corpus of data of motions of aircraft, ground vehicles, and other transient objects on airport ground surfaces in comparison with motions predicted by the one or more ML model(s) 148. In addition, collision awareness system 140 of computing system 138 may implement an expert rules system that may incorporate knowledge of general airline gate assignments, specific gate assignments for a particular aircraft or a given flight, and data on assigned taxiing routes between gate areas and runways, that ML model(s) 148 may use to predict routes. For example, processing circuitry 142 may deploy a ML model 148 trained on historical navigation route data from historical vehicle data 106 and trained on general airline gate assignments to predict routes of one of vehicles 111, including worst case and best case scenario routes that may be combined to determine a single predicted route, in accordance with techniques of this disclosure.
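Of the model families mentioned above, a k-nearest neighbors predictor is simple enough to sketch in a few lines. The following is an illustrative sketch only, not the disclosed implementation: the historical samples, coordinate values, and function names are hypothetical stand-ins for data that would come from historical vehicle data 106.

```python
import math

# Hypothetical historical samples: (latitude, longitude, speed in m/s) paired
# with the position observed one second later. Values are illustrative only.
HISTORY = [
    ((32.7300, -117.1900, 8.0), (32.7301, -117.1899)),
    ((32.7301, -117.1899, 8.0), (32.7302, -117.1898)),
    ((32.7305, -117.1895, 4.0), (32.7305, -117.1894)),
]

def predict_next_position(state, k=2):
    """Pattern-recognition sketch: average the observed next positions of the
    k historical states closest to the current state."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    neighbors = sorted(HISTORY, key=lambda sample: dist(sample[0], state))[:k]
    lat = sum(nxt[0] for _, nxt in neighbors) / k
    lon = sum(nxt[1] for _, nxt in neighbors) / k
    return lat, lon
```

A production system would train on far larger corpora and condition on traffic, weather, and time of day as described above; this sketch only shows the nearest-neighbor pattern-matching idea.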
  • In some examples, collision awareness system 140 may be enabled and implemented in existing airport systems, vehicles 111, and/or user interface devices 104, with only minimal hardware or software changes. In addition, in some examples, collision awareness system 140 may be configured to provide credible false alarm mitigation as well as the capability to use a variety of data inputs such as from Automatic Dependent Surveillance - Broadcast (ADS-B) sources, and provide coverage of any types of vehicles 111 that might potentially collide with a fixed structure or another one of vehicles 111.
  • FIG. 3 depicts a flowchart for an example process 300 that collision awareness system 140 may implement for providing collision zone predictions, in accordance with illustrative aspects of this disclosure. Process 300 may include some features that may be optional in some examples. In this example, process 300 includes obtaining (e.g., by processing circuitry 142 of collision awareness system 140) historical navigation route data for one or more vehicles 111 (302). For example, processing circuitry 142 may identify historical navigation route data from historical vehicle data 106. The historical navigation route data may be based at least in part on transponder positional data from vehicles 111. For example, vehicles 111 may transmit, via network 130, navigation route data that may be stored over time as historical navigation route data in historical vehicle datastore 106.
  • Process 300 further includes identifying (e.g., by processing circuitry 142) a plurality of aerodrome guidance features for a particular aerodrome location (304). For example, processing circuitry 142 may identify aerodrome guidance features for a particular airport from aerodrome guidance data 108. In some examples, the aerodrome guidance features may include guidance marker information, such as guidance signs and guidance lines of an aerodrome location. For example, guidance lines may include coordinates of line markings fixed to an aerodrome surface (such as by being painted onto a ground surface).
  • Process 300 further includes determining (e.g., by processing circuitry 142) a predicted path of a first vehicle (306). In some examples, the predicted path may include a first portion of the predicted path and a second portion of the predicted path. For example, the first portion of the predicted path may include areas of an aerodrome including guidance features, such as on a taxiway or runway, whereas the second portion of the predicted path may include areas of an aerodrome that do not include defined guidance features or areas for which guidance features are unavailable from a database. In such examples, processing circuitry 142 may predict the first portion of the predicted path using the aerodrome guidance features, such as particular guidance marker or guidance signs information, and may predict the second portion of the predicted path using the historical navigation route data from datastore 106. In addition, processing circuitry 142 may predict the first portion of the predicted path using historical navigation route data from datastore 106. For example, processing circuitry 142 may receive information from a particular one of vehicles 111 regarding speed information of vehicle 111 (e.g., the current speed of an aircraft). In such examples, processing circuitry 142 may align the historical path points (e.g., as described with reference to Table 1), speed, and time parameters with surface guidance lines included with aerodrome guidance data 108. The historical navigation route data may indicate how long on average a vehicle takes to travel from one position along a guidance line to another position along the guidance line depending on various factors, such as time of day, vehicle size and weight, traffic flow information, etc. As such, processing circuitry 142 may estimate various ahead positions along the guidance lines based on the historical navigation route data. 
An illustrative example of processing circuitry 142 using historical navigation route data aligned along (e.g., mapped to) an example aerodrome guidance feature is described below with reference to FIG. 4.
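The idea of estimating ahead positions from average historical traversal times can be sketched as follows. This is a minimal illustration under assumed data: the segment times, point labels, and function name are hypothetical, standing in for statistics that would be derived from historical navigation route data for a given time of day, vehicle class, and traffic level.

```python
# Hypothetical average traversal times (in seconds) between consecutive points
# along a surface guidance line, as might be derived from historical route data.
SEGMENT_SECONDS = [5.0, 5.0, 10.0, 8.0]           # segment i spans point i to i+1
GUIDANCE_POINTS = ["P0", "P1", "P2", "P3", "P4"]  # labeled guidance-line positions

def estimated_position_after(seconds):
    """Walk the guidance line using historical segment times to estimate which
    labeled point the vehicle will have reached after `seconds` seconds."""
    elapsed = 0.0
    for i, seg in enumerate(SEGMENT_SECONDS):
        if elapsed + seg > seconds:
            return GUIDANCE_POINTS[i]   # still traversing segment i
        elapsed += seg
    return GUIDANCE_POINTS[-1]          # beyond the last known segment
```

For example, 12 seconds of travel falls inside the third segment, so the vehicle is estimated to still be between the points labeled "P2" and "P3".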
  • In some examples, the predicted path of the first vehicle is based on a combination of initially predicted paths of the first vehicle, the initially predicted paths including a likelihood of the first vehicle traveling a first initially predicted path and a likelihood of the first vehicle traveling a second initially predicted path. For example, processing circuitry 142 may combine or average predicted paths, representing a plurality of best case historical paths and worst case historical path segments, to generate the predicted path. The predicted path may be specific to a particular one of vehicles 111 and may include a path connecting points for the particular one of vehicles 111 from a first time (e.g., A1(t1)) to one or more other prospective times (e.g., A1(t1 + n), where n represents an integer number of time steps added to the first time t1).
  • In such examples, processing circuitry 142 may determine at least two initially predicted paths of the first vehicle. The processing circuitry 142 may also identify a likelihood of the first vehicle traveling any of the initially predicted paths. For example, the likelihood may include an indication of a best case path, meaning that the predicted path is most likely to occur, or an indication of a worst case path, meaning that the predicted path is least likely to occur, or other paths that have likelihoods that fall between best case and worst case paths. In any event, processing circuitry 142 may determine the predicted path of the first vehicle based on a combination of at least two initially predicted paths of the first vehicle. The combination may be based on processing circuitry 142 deploying a ML model able to determine initially predicted paths, combine predicted paths, or both to determine a combined predicted path. In some instances, the combination may be based on the likelihood information, such that more weight is provided to paths that are more likely to occur under the circumstances and less weight is provided to paths that are less likely to occur under the same circumstances. For example, processing circuitry 142 may determine a weighted average of initially predicted paths or deploy one of ML model(s) 148 to determine a weighted average or other combinations of initially predicted paths.
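A likelihood-weighted average of initially predicted paths can be sketched as below. This is an illustrative sketch under assumed inputs: the path points, likelihood values (0.8 best case, 0.2 worst case), and function name are hypothetical.

```python
def combine_paths(paths_with_likelihoods):
    """Combine initially predicted paths into one predicted path via a
    likelihood-weighted average. Each path is a list of (x, y) points, one per
    prospective time step; each weight is that path's likelihood."""
    total = sum(w for _, w in paths_with_likelihoods)
    n_points = len(paths_with_likelihoods[0][0])
    combined = []
    for i in range(n_points):
        x = sum(path[i][0] * w for path, w in paths_with_likelihoods) / total
        y = sum(path[i][1] * w for path, w in paths_with_likelihoods) / total
        combined.append((x, y))
    return combined

# Best case path weighted 0.8, worst case path weighted 0.2 (illustrative).
best_case = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
worst_case = [(0.0, 0.0), (10.0, 5.0), (20.0, 10.0)]
predicted = combine_paths([(best_case, 0.8), (worst_case, 0.2)])
```

Here the combined path is pulled mostly toward the best case path, reflecting its higher likelihood; a deployed ML model could learn these weights rather than taking them as fixed inputs.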
  • In some examples, processing circuitry 142 may classify and transfer the predicted path data to datastore 106. Processing circuitry 142 may classify the predicted path based on aircraft type, such that the predicted path may be referenced for future use with similarly situated vehicles 111.
  • In some examples, processing circuitry 142 may retrieve certain historical data from historical vehicle data 106, to identify a predicted path based on a particular vehicle type and/or current state information of the vehicle. For example, the particular vehicle may be an ownship executing the techniques of this disclosure on an EFB located in the cockpit of the ownship. In such examples, processing circuitry 142 of the EFB may select the best suitable historical data for determining a predicted path. In some examples, the selected best suitable historical data may be dynamic from location-to-location for identifying predicted paths using the ownship parameters. In some examples, ML models may be used for best suitable path selection based on the current data and historical data.
  • In some examples, processing circuitry 142 may identify clearance information of a traffic controller defining one or more target marks for the first vehicle 111. For example, processing circuitry 142 may query a database for clearance information. The clearance information may include a destination location as one target mark for the first vehicle but may also include multiple target marks along the way to the destination location, such that a vehicle will follow the path along the target mark to reach the destination location.
  • In some examples, processing circuitry 142 may identify one or more target aiming features from the plurality of aerodrome guidance features based at least in part on the clearance information. In such examples, the one or more target aiming features may be configured to provide guidance through a particular part of an aerodrome toward the one or more target marks. For example, the target aiming features may include aerodrome guidance features that aim or steer a vehicle toward a target, such as by providing arrows, whether virtual or real, that guide vehicle 111 to a target. As such, processing circuitry 142 may identify the first portion of the predicted path using the one or more target aiming features. For example, the first portion may include aerodrome guidance features, such that historical navigation data and aerodrome guidance features may be used in conjunction with one another to determine a predicted path of a vehicle 111 through the first portion of the predicted path. In such examples, processing circuitry 142 may identify the second portion of the predicted path using the clearance information and historical navigation route data, the second portion of the predicted path including a destination location of the first vehicle defined by the clearance information as a target mark of the one or more target marks. The second portion of the predicted path may be through an apron area of an aerodrome that does not include guidance features and thus, historical navigation data may be used to predict the portion of the path. It should be noted that the first portion and the second portion may be switched in some examples, such as when a vehicle is leaving an apron or gate area toward a runway. 
That is, in some examples, the second portion of the predicted path may include an airport apron region or may include a taxiway comprising surface guidance markings, depending on which direction the vehicle 111 is destined to travel (e.g., toward a gate, toward a runway, toward a hangar bay, or somewhere between, etc.).
  • Process 300 further includes determining (e.g., by processing circuitry 142) a predicted position of the first vehicle along the predicted path at a prospective time (308). In some examples, processing circuitry 142 may implement regression algorithms to predict an immediate accurate position using the previous position, speed, and heading information (e.g., A1(t + 1) to A1(t + 2), where time t is in seconds). In some scenarios, the regression models are used to minimize position deviation error in the historical data. In some examples, data points, such as those shown in Table 1 above, may be used to calculate a cumulative distance of a particular vehicle 111. For example, processing circuitry 142 may calculate the cumulative distance from A1(t1) to A1(t + n). In such examples, processing circuitry 142 may determine all intermediate path points of a path segment from time 't' to 't + n', assuming all intermediate points are locally linear. In some examples, processing circuitry 142 may utilize a function, such as a great circle distance formula, great circle earth model, or an equivalent projection system formula to determine position information based on the calculated distance and direction. Processing circuitry 142 may determine the directionality of one of vehicles 111 from the predicted positions along the predicted path. As shown in FIGS. 5A-5C, A1(t1) may be determined based on the function using the current position of the particular vehicle 111 and a cumulative distance according to equations [1] or [2] below.
  • In such examples, processing circuitry 142 may determine movement information of the first vehicle at a current position. In some examples, the movement information may include speed information of the first vehicle. For example, processing circuitry 142 may receive sensor data or transponder data from one of vehicles 111 indicating a rate at which the particular vehicle 111 is traveling. As such, processing circuitry 142 may identify the predicted position of the first vehicle using the movement information and the historical navigation route data. For example, processing circuitry 142 may determine how much distance the particular vehicle 111 will travel along a predicted path based on the rate at which the particular vehicle 111 is traveling at a current position along the predicted path.
  • Process 300 further includes determining (e.g., by processing circuitry 142) a predicted position of a second vehicle with respect to approximately the same prospective time (310). For example, processing circuitry 142 may determine a predicted position of the first one of vehicles 111 at a time 15 seconds in the future and thus, may determine a predicted position of another one of vehicles 111 at a time 15 seconds in the future. In essence, processing circuitry 142 may determine a predicted position of each vehicle at any number of prospective times, and processing circuitry 142 will likely find a time or range of times in the future that represent when a vehicle collision is likely to occur (e.g., 14-15 seconds in the future). In some examples, the prospective time corresponding to the predicted position of the second vehicle may be the same as the prospective time corresponding to the predicted position of the first vehicle. For example, the predicted position for both vehicles may correspond to a prospective time of 15 seconds in the future. However, in some instances, the predicted positions may not correspond to the exact same prospective times. For example, due to the size of a vehicle, a predicted position at T = 14 seconds and a predicted position at T = 15 seconds may indicate an overlap of vehicle envelopes at either 14 or 15 seconds. In another example, the predicted positions may be determined at different intervals. For example, the predicted position for a first vehicle may be determined on a second-by-second basis, whereas the predicted position for a second vehicle may be determined on a half-second or every other second basis. In such instances, processing circuitry 142 may perform interpolation techniques to predict collision zones at times that are approximately the same (e.g., within a half second or a few seconds of one another), but may not be exactly the same.
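The interpolation step that puts two vehicles' differently sampled predictions on a common timeline can be sketched as below. The tracks, sampling intervals, and function name are illustrative assumptions.

```python
def interpolate_position(track, t):
    """Linearly interpolate a vehicle's predicted (x, y) position at time t
    from a track of (time, x, y) predictions sampled at some other interval."""
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
    raise ValueError("time lies outside the predicted track")

# First vehicle predicted every 2 seconds, second vehicle every 1 second;
# interpolation lets both be compared at the same prospective time t = 3.
track_a = [(0, 0.0, 0.0), (2, 20.0, 0.0), (4, 40.0, 0.0)]
track_b = [(0, 60.0, 0.0), (1, 55.0, 0.0), (2, 50.0, 0.0), (3, 45.0, 0.0)]
```

With both positions expressed at (approximately) the same prospective time, the envelope comparison of step (312) can proceed directly.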
  • Process 300 further includes performing (e.g., by processing circuitry 142) a comparison of vehicle envelopes for the first vehicle and the second vehicle at the predicted positions (312). For example, processing circuitry 142 may retrieve vehicle envelope data from historical vehicle data 106. The vehicle envelope data may include safe zone envelopes for a single vehicle 111 or multiple vehicles 111, such as in the case of an aircraft 112A being towed by a vehicle tug 113A.
  • Process 300 further includes identifying (e.g., by processing circuitry 142) an overlap of vehicle envelopes (314). In some instances, processing circuitry 142 may predict a position of one vehicle 112A as turning toward a stationary vehicle 112B. Processing circuitry 142 may determine that through the turn, a safe zone envelope of vehicle 112A will overlap with stationary vehicle 112B and thus, may identify a predicted collision zone. That is, process 300 may be concluded after determining (e.g., by processing circuitry 142) a predicted collision zone of the first vehicle and the second vehicle at the prospective time based at least in part on the overlap of the vehicle envelopes (316). In the above example process, one or both of the vehicles may be powered off, such that avionics are not continuously running so as to provide updates on position of vehicles 111. As such, processing circuitry 142 uses the techniques and data described in process 300 to predict movement of vehicles 111, where an exact location may not be available for the vehicle 111 based on sensor or transponder data.
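Steps (310)-(316) can be sketched as a scan over per-second predicted positions for the earliest envelope overlap. This sketch models each safe zone envelope as a circle for brevity; as noted elsewhere in this disclosure, envelopes may take vehicle-specific shapes. All positions, radii, and function names here are illustrative.

```python
import math

def envelopes_overlap(pos_a, radius_a, pos_b, radius_b):
    """Circular safe zone envelopes overlap when the gap between predicted
    positions is less than the sum of the envelope radii."""
    gap = math.hypot(pos_a[0] - pos_b[0], pos_a[1] - pos_b[1])
    return gap < radius_a + radius_b

def first_collision_time(preds_a, radius_a, preds_b, radius_b):
    """Scan per-second predicted positions of two vehicles for the earliest
    prospective time (in seconds) at which their envelopes overlap."""
    for t, (pa, pb) in enumerate(zip(preds_a, preds_b)):
        if envelopes_overlap(pa, radius_a, pb, radius_b):
            return t
    return None  # no predicted collision zone within the prediction horizon

# A moving vehicle vs. a stationary, powered-off vehicle at (30, 12), both
# assigned a 10-meter envelope radius (illustrative values).
preds_a = [(10.0 * t, 0.0) for t in range(6)]
preds_b = [(30.0, 12.0)] * 6
```

Returning the earliest overlapping time corresponds to identifying the predicted collision zone's prospective time; the overlapping region itself would then be reported for display.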
  • In some examples, processing circuitry 142 executing process 300 may include processing circuitry 142 of one or more remote data server(s) 132. In such examples, processing circuitry 142 of the one or more remote data server(s) 132 may receive a current position of the first vehicle 111 or an indication of the current position. Processing circuitry 142 may execute all or some of process 300 in order to determine a predicted collision zone. In some examples, processing circuitry 142 of the remote server may transmit the predicted collision zone from the remote server to the first vehicle 111, such as to an EFB or other user interface device 104 that corresponds to vehicle 111.
  • FIG. 4 depicts an example technique that collision awareness system 140 may implement for aligning historical navigation routes with aerodrome guidance features when determining predicted positions of vehicles 111. In some examples, processing circuitry 142 may align historical navigation routes so as to coincide with aerodrome guidance features, such as surface guidance lines. In an example, processing circuitry 142 may predict a path for a first vehicle 111 from an interim source point 404A to an interim destination point 404B (e.g., target marks). The source and destination points 404 may be in an area of an aerodrome having aerodrome guidance features. For example, a surface guidance line 406 may be between source and destination points 404. In some examples, processing circuitry 142 may determine source and destination points 404 based on various predicted points along a predicted path, where the predicted path may be updated as vehicle 111 approaches each predicted point along the predicted path. In some examples, the source and destination points 404 may be based on vehicle clearance data received from traffic controller 114. In some examples, processing circuitry 142 may utilize a combination of clearance data and historical navigation route data to determine points 404A and 404B configured to guide vehicle 111 to an ultimate destination point that may deviate from areas of the aerodrome having surface guidance lines, such as surface guidance line 406.
  • As shown in FIG. 4, processing circuitry 142 may determine historical navigation route data 410 between points 404A and 404B. Processing circuitry 142 may determine the historical navigation route data 410 from historical vehicle data 106. In some examples, historical navigation route data 410 of FIG. 4 may be a combination (e.g., an average) of a plurality of predicted paths, combined into a single predicted path 410 made up of various predicted points along predicted path 410 (e.g., based on weighted averages based on likelihood of each predicted path).
  • Processing circuitry 142 may align historical data points 410 along a predicted path between points 404A and 404B to determine an aligned predicted path 412. The aligned predicted path 412 may be aligned along surface guidance features, such as surface guidance line 406. Processing circuitry 142 may use aligned predicted path 412 to determine predicted points (e.g., target marks) along the aligned predicted path 412 at prospective times in order to determine collision zones in accordance with various techniques of this disclosure. It will be understood that target marks refer to various points on a surface that a vehicle may target as the vehicle proceeds along a path in order for the vehicle to navigate to a final target mark or final destination. Target marks may change over time, e.g., as processing circuitry 142 updates the predicted path. Processing circuitry 142 may further update predicted positions along the updated predicted paths over time, e.g., based on changes in vehicle speed.
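The alignment of historical route points onto a surface guidance line can be sketched as a projection of each point onto the line segment between two target marks. The coordinates and function names below are illustrative, and a straight-line segment stands in for a guidance line that could in practice be curved or piecewise.

```python
def project_onto_segment(p, a, b):
    """Project point p onto the segment from a to b (all (x, y) tuples)."""
    ax, ay = a
    dx, dy = b[0] - ax, b[1] - ay
    f = ((p[0] - ax) * dx + (p[1] - ay) * dy) / (dx * dx + dy * dy)
    f = max(0.0, min(1.0, f))  # clamp so the result stays on the segment
    return (ax + f * dx, ay + f * dy)

def align_path(points, mark_a, mark_b):
    """Snap noisy historical route points onto the surface guidance line
    between two target marks, yielding an aligned predicted path."""
    return [project_onto_segment(p, mark_a, mark_b) for p in points]
```

Each historical point lands at its nearest position on the guidance line, so prospective-time positions are then predicted along the line itself rather than along the noisy raw track.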
  • FIGS. 5A-5C depict conceptual diagrams of a portion of an aerodrome 500 with various aircraft and ground vehicles on airport runways, taxiways, and other aerodrome ground surfaces. The various aircraft and ground vehicles shown in FIGS. 5A-5C include aircraft 112 and ground vehicles 113 (designated in FIGS. 5A-5C as A1-A4 for simplicity). As discussed above with respect to FIG. 1, collision awareness system 140 may be configured to determine predicted paths and positions of vehicles 111, determine actual positions and velocities of vehicles 111, determine alert envelopes, and predict collision zones of vehicles 111. Collision awareness system 140 may then output the position and velocity information for the one or more transient surface objects and indications of potential collision zones to network 130, such that these outputs from collision awareness system 140 may be received by an EFB on at least one of the vehicles 111 among the transient surface objects on the ground surfaces of aerodrome 500.
  • FIG. 5A illustrates a simplified example of two vehicles A1 and A2, which may be aircraft among vehicles 112A-112N, but will be referred to as vehicles A1 and A2 for simplicity in illustrating a time progression using tX indicators. In the examples of FIGS. 5A-5C, t0-tX indicate time in seconds. For example, t0 indicates an initial starting point of time 0, whereas t5 indicates a predicted position after 5 seconds have passed. In FIG. 5A, vehicle A1 has received clearance information to park at a particular destination location. Processing circuitry 142 may predict a path 502 of vehicle A1 in accordance with various techniques of this disclosure. For example, processing circuitry 142 may deploy a ML model to determine a combined predicted path determined from best case and worst case path predictions as informed by historical navigation route data at aerodrome 500 or other aerodromes. Processing circuitry 142 may determine a predicted position of vehicle A1 along predicted path 502 at any time interval. While the examples of FIGS. 5A-5C show 5 second intervals, the techniques of this disclosure are not so limited, and any time interval may be used including a variable time interval. In the example of FIG. 5A, four predicted positions are predicted for vehicle A1 at 5 seconds, 10 seconds, 15 seconds, and 30 seconds. Processing circuitry 142 may determine the predicted positions using historical navigation route data, aligned with aerodrome guidance features where available, or in some examples, may use historical navigation route data without aerodrome guidance features where the aerodrome guidance features are unavailable, such as in areas of aerodrome 500 where guidance lines are nonexistent (e.g., apron area, a gate area, etc.).
  • Processing circuitry 142 may further determine a predicted path and a predicted position of vehicle A2. In some examples, vehicle A2 may be an aircraft that received clearance information from traffic controller 114 indicating one or more target marks for vehicle A2. For example, processing circuitry 142 may identify a predicted path of the second vehicle using a predicted current position of the second vehicle. In such examples, processing circuitry 142 may predict the current position of the second vehicle based on the historical navigation route data. For example, processing circuitry 142 may deploy a ML model trained on historical navigation route data, aerodrome guidance features, and/or clearance information to determine the predicted path. The ML model may identify patterns in the historical navigation route data, aerodrome guidance features, and/or clearance information that indicate a predicted path that a vehicle is likely to take toward a target mark or target destination location.
  • As such, processing circuitry 142 may predict positions of vehicle A2 along predicted route 503, the predicted positions including at least one time that coincides with a predicted position time with respect to vehicle A1 (e.g., t10, t15, t30). In some examples, processing circuitry 142 may determine the predicted position of the second vehicle using the predicted current position of the second vehicle and one or more of: the historical navigation route data, the plurality of aerodrome guidance features, or clearance information for the second vehicle. In the example of FIG. 5A, processing circuitry 142 may determine that an overlap of safe zone envelopes for both vehicles A1 and A2 will occur at a prospective time of 15 seconds in the future, unless certain changes are made to the system, such as a slowing or speeding up of one or the other vehicle.
  • Processing circuitry 142 may perform another prediction at various intervals using new information as the information becomes available, such as velocity or acceleration data of vehicles A1 and A2. In any event, processing circuitry 142 may identify the predicted collision zone as the area of overlap 508. While safe zone envelopes are shown as a circle in FIGS. 5A-5C, safe zone envelopes may be any shape and may be specific to the shape of the particular vehicle 111. For example, where vehicle 112A (e.g., A1) is a particular aircraft having a particular size and shape, the safe zone envelope for vehicle 112A may resemble the size and shape of vehicle 112A, such that the detected overlap will indicate where on the vehicle the overlap is predicted to occur. In the example of FIG. 5A, processing circuitry 142 may determine the overlap of envelopes 508 indicates that the nose of vehicle A2 is predicted to collide with the left wing of vehicle A1 at a prospective time of t15.
  • If there are any collision zones ahead of the ownship, processing circuitry 142 generates a visual or text notification for display on user interface device 104. As an example, equation [1] below is used for computing the cumulative distance along the track from time 't' to time 't + n', where 't' is in seconds and 'n' is a positive integer value:

        distance = u·t + (1/2)·a·t²     [1]

    where 'u' refers to the velocity or speed, 't' refers to the change in time, and 'a' refers to the acceleration (change in speed).
  • As shown in FIG. 3, the historical performance/track data has speed and velocity information. The acceleration is calculated from the speed difference at each position. If the speed is constant (i.e., 'a' = 0), then equation [1] (above) reduces to equation [2] (below):

        distance = u·t     [2]
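Equations [1] and [2], together with the acceleration estimate from consecutive speed samples, can be written directly as code. The sample speeds and interval below are illustrative values, not data from the disclosure.

```python
def distance_travelled(u, a, t):
    """Equation [1]: distance = u*t + (1/2)*a*t**2. With constant speed
    (a == 0) this reduces to equation [2]: distance = u*t."""
    return u * t + 0.5 * a * t * t

def acceleration_from_speeds(u0, u1, dt):
    """Estimate acceleration from the speed difference between two samples of
    the historical performance/track data taken dt seconds apart."""
    return (u1 - u0) / dt
```

For example, a vehicle accelerating from 8 m/s to 10 m/s over 4 seconds (a = 0.5 m/s²) covers 36 m in that window, versus 32 m at a constant 8 m/s.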
  • In the example of FIG. 5B, vehicle A3 represents an aircraft tug vehicle pulling an aircraft. Processing circuitry 142 may predict a path of vehicle A3 and determine that vehicles A1 and A3 are not predicted to collide because, at a prospective time of 5 seconds, vehicle A1 is predicted to be beyond the intersection point of the predicted paths of vehicles A1 and A3. In the example of FIG. 5B, processing circuitry 142 may predict a first portion of the predicted path of vehicle A3 using historical navigation route data, such as in the apron area of aerodrome 500. Processing circuitry 142 may predict a second portion of the predicted path of vehicle A3 using historical navigation route data and aerodrome guidance features, such as in an area of aerodrome 500 that includes guidance features that vehicles 111 are expected to follow to reach a predefined destination.
  • The example of FIG. 5C is similar except that a vehicle A4 is predicted to be located near the path of A3. In the example of FIG. 5C, vehicle A4 may be a parked vehicle that has the avionics system turned off. As such, processing circuitry 142 may use historical navigation route data to determine the predicted position of vehicle A4 and determine a predicted path of vehicle A4. In this example, vehicle A4 does not have a predicted path in the foreseeable future based on clearance information, historical navigation route data, and/or other aircraft information available to processing circuitry 142, such as flight times relating to vehicle A4, etc. In such examples, processing circuitry 142 may determine whether vehicle A3 will have enough clearance to follow along a predicted path without clipping parked vehicle A4. In some examples, processing circuitry 142 may determine a predicted collision zone between vehicles A3 and A4 and provide a notification for display on one of user interface devices 104.
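The clearance check for a parked vehicle described above can be illustrated as a minimum-distance test between the parked position and each segment of the predicted path. This is a hedged sketch: the path coordinates, the parked position, the clearance value, and the helper names are all hypothetical, not values from the disclosure:

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to the line segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    abx, aby = bx - ax, by - ay
    ab2 = abx * abx + aby * aby
    if ab2 == 0.0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping to its endpoints.
    s = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / ab2))
    return math.hypot(px - (ax + s * abx), py - (ay + s * aby))

def path_clears_parked_vehicle(path, parked_pos, clearance):
    """True if every segment of the predicted path keeps at least
    `clearance` meters from the parked vehicle's position."""
    return all(
        point_segment_distance(parked_pos, path[i], path[i + 1]) >= clearance
        for i in range(len(path) - 1)
    )

# Hypothetical predicted path of A3 and parked position of A4 (meters).
a3_path = [(0.0, 0.0), (50.0, 0.0), (50.0, 80.0)]
a4_parked = (60.0, 40.0)
print(path_clears_parked_vehicle(a3_path, a4_parked, 8.0))
```

Here the clearance threshold would reflect the combined envelope sizes of the tug-and-aircraft combination and the parked vehicle.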
  • FIG. 6 is a diagram of an example graphical output display 610 of an airport ground surface map that may be implemented by a two-dimensional airport moving map display (2D AMMD) application that may be implemented on an EFB. For example, a 2D AMMD application may be implemented on one of user interface devices 104, such as an EFB tablet computer in the cockpit of a particular aircraft (e.g., aircraft 112 in FIG. 1), e.g., while the aircraft is on the ground and, for example, taxiing or otherwise moving. In other examples, the 2D AMMD application may be implemented on another device, other than or in addition to, one of user interface device(s) 104. In other examples, a graphical output display analogous to graphical output display 610 may be implemented by a three-dimensional AMMD application that may be implemented on an EFB or other application package executing on a tablet computer or other type of computing and display device.
  • The AMMD application graphical output display 610 (or "AMMD 610") includes representations (e.g., graphical icons) of transient surface vehicles that may be received and/or decoded by a transient surface object overlay module of an AMMD application executing on user interface device 104 that provides AMMD 610. AMMD 610 thus includes a graphical icon of an ownship 612 (e.g., that may correspond to vehicles 111 in FIG. 1 and one of vehicles A1-A4 in FIGS. 5A-5C, in the cockpit of which user interface device 104 is being used); graphical icons of other moving vehicles 614, 616, and 618 (e.g., corresponding to other vehicles 111); and graphical icons of ground vehicles 622, 624 (e.g., corresponding to ground vehicles 113), in the display example shown in FIG. 6.
  • AMMD 610 also includes representations of aerodrome guidance features, such as surface guidance markers 642 and 644. AMMD 610 may also include representations of taxiways 634, 636, and 638, and apron areas 652, 654, and 656 near airport terminal building portions 662, 664, and 666. AMMD 610 may include indications of potential collision zones provided by collision awareness system 140, such as warning graphic 670 and textual warning notice 672 between ownship icon 612 and aircraft icon 614, based on a predicted collision zone determined and transmitted by collision awareness system 140. In one example, ownship 612 may have a predicted route leading from between surface guidance markers 642 to apron area 656. In such instances, a first portion of the predicted route may include portions of taxiway 634 and a second portion of the predicted route may include portions of apron area 656. In any event, the first portion may correspond to areas of the particular aerodrome location that include aerodrome guidance features, such as surface guidance markers, whereas the second portion may correspond to areas of the particular aerodrome location that do not include aerodrome guidance features, such as apron areas.
  • In some examples, vehicle 614 and/or vehicle 612 may not be physically present at the position shown on AMMD 610. That is, AMMD 610 may display predicted positions of vehicle 614 and/or vehicle 612 at or near predicted collision zone 670. AMMD 610 may also display predicted routes along with current positions and one or more predicted positions over time. In another example, AMMD 610 may display one or more predicted positions of second vehicle 614, such that a user may view the predicted route and predicted positions contributing to the predicted collision zone of ownship 612 and second vehicle 614. The user may toggle on and off various aspects of the displayed information, such as toggling on or off the predicted positions of other vehicles. In one example, predicted positions and/or predicted routes may be displayed as holograms or otherwise as faint depictions of vehicle movement or of the stationary location of a vehicle, so as to indicate to a user that the position or route is not an actual route, but instead represents a predicted route that is subject to change over time based on predictions from collision awareness system 140.
  • In this example, collision awareness system 140 may connect to an aircraft system of a particular aircraft 612 (e.g., the EFB application running on user interface device 104) over the extended range wireless network (via a wireless router, such as wireless router 710 of FIG. 7), where the particular aircraft may be among the transient surface objects that collision awareness system 140 is monitoring. In some examples, collision awareness system 140 may establish a secure wireless communication channel over the extended range wireless network with the EFB application running on user interface device 104, or with another aircraft system on the particular aircraft, and then transmit its information, including the position and velocity information for the one or more transient surface objects, over the secure wireless communication channel.
  • In various examples, the EFB application executing on user interface device 104 may thus receive all of the information transmitted by collision awareness system 140, and receive all of the benefit of collision awareness system 140, simply with a software upgrade to an EFB application that implements examples of this disclosure. Thus, in various examples, a pilot or flight crew may gain the benefits of this disclosure without requiring any new hardware (since an EFB application of this disclosure may execute on an EFB tablet computer or other EFB that a flight crew already has), and without requiring any hardware or software change to the installed equipment of the aircraft itself, and thus without the need to go through a certification process of any new installed aircraft systems. A pilot, flight crew, or aircraft operator may also enjoy the benefits of an implementation of this disclosure without the need to rely on new hardware or software from an original equipment manufacturer (OEM) of installed hardware or software systems installed in the aircraft.
  • FIG. 7 is a conceptual block diagram depicting an example aerodrome network system with example ground surface sensors that may be used in conjunction with collision awareness system 140. In the example shown in FIG. 7, the example aerodrome network system includes collision awareness system 140 connected to a wireless router 710 via communication circuitry 144. Collision awareness system 140 may be communicatively connected to a number of airport ground surface sensors of various types, including a surface movement radar (SMR) transceiver 720, multilateration sensor(s) 722, multilateration reference transmitter 724, and/or to additional types of airport ground surface sensors via multilateration system LAN 770 or redundant airport system ethernet local area networks (LANs) 716A and 716B ("airport system LANs 716"). Multilateration sensor(s) 722 may gather data on the movement of surface vehicles 111 and provide the data to collision awareness system 140 via network 130 (e.g., airport system ethernet LAN 716, etc.), such that collision awareness system 140 may use such data to confirm various predictions based on non-sensor data.
  • In some examples, processing circuitry 142 may receive ground surface sensor data for a first vehicle and/or a second one of vehicles 111. Processing circuitry 142 may receive ground surface sensor data collected as described in various techniques of U.S. Patent Publication No. 2016/0196754, by Lawrence J. Surace, filed Jan. 6, 2015, the entire content of which is incorporated herein by reference. For example, SMR transceiver 720 is connected to at least one SMR antenna 726. SMR transceiver 720 and SMR antenna 726 may be configured to detect, monitor and gather data from various airport ground surfaces and detect transient surface objects on the airport ground surfaces, including aircraft, ground vehicles, and any other moving or impermanent objects on the ground surfaces (or "transient surface objects"). In such examples, processing circuitry 142 may use data from one or more SMR transceivers 720, multilateration sensors 722, or other airport ground surface sensors, and combine the data from these multiple airport ground surface sensors to generate position and velocity information for the one or more transient surface objects on the airport ground surfaces. Processing circuitry 142 may use the position and velocity information for the one or more transient surface objects on the airport ground surfaces to determine predicted positions along predicted paths by extrapolating a position using the current position, velocity information, and predicted changes in velocity or position as informed by historical navigation route data.
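The extrapolation step described above can be sketched as dead reckoning along a predicted route using the equation [1] distance. This is an illustrative example under simplified assumptions (a 2-D polyline route and constant acceleration); the route coordinates and function name are hypothetical:

```python
import math

def advance_along_path(path, start, u, t, a=0.0):
    """Advance from `start` (assumed to lie on `path`) by the distance
    d = u*t + 0.5*a*t**2 (equation [1]), following the route's polyline
    segments, and return the extrapolated position."""
    d = u * t + 0.5 * a * t ** 2
    pos = start
    # Walk the remaining waypoints, consuming distance segment by segment.
    for wp in path:
        seg = math.dist(pos, wp)
        if seg >= d:
            f = d / seg if seg else 0.0
            return (pos[0] + f * (wp[0] - pos[0]),
                    pos[1] + f * (wp[1] - pos[1]))
        d -= seg
        pos = wp
    return pos  # ran past the last waypoint; clamp to the route's end

# Hypothetical taxi route: two straight legs with a 90-degree turn.
route = [(100.0, 0.0), (100.0, 200.0)]
print(advance_along_path(route, (0.0, 0.0), 8.0, 20.0))  # 160 m along the route
```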
  • In any event, processing circuitry 142 may then determine a current position of the first vehicle and/or the second vehicle from the ground surface sensor data. In some instances, processing circuitry 142 may determine the current position for one vehicle, whereas the other vehicle may be parked and out of range of the ground surface sensors. In any event, processing circuitry 142 may predict a current position of the other vehicle using historical navigation route data. In such examples, processing circuitry 142 may identify both a predicted position of the first vehicle and a predicted position of the second vehicle using the current position of the first and second vehicles 111.
  • Multilateration sensors 722 may be configured to detect, monitor and gather data from various airport ground surfaces and to detect transient surface objects on the airport ground surfaces, in ways that may complement the detection by SMR transceiver 720. Example multilateration sensor data collection techniques are described in U.S. Patent Publication No. 2016/0196754 . For example, multilateration sensors 722 may be implemented as omnidirectional antenna sensors stationed at various remote locations around the airport.
  • In some examples, collision awareness system 140 may be connected to any one or more sensors of a wide variety of other types of sensors configured to detect transient surface objects on the airport ground surfaces. For example, processing circuitry 142 of collision awareness system 140 may be communicatively connected to and configured to receive data from one or more microwave sensors, optical imaging sensors, ultrasonic sensors, lidar transceivers, infrared sensors, and/or magnetic sensors. In some examples, collision awareness system 140 may incorporate features and/or components of an airport ground surface monitoring system, such as Advanced Surface Movement Guidance and Control System (A-SMGCS) or Airport Surface Detection Equipment-Model X (ASDE-X) System. For example, collision awareness system 140 may incorporate one or more of the SMR transceiver 720 and SMR antenna 726, multilateration sensors 722, and/or other airport ground surface sensors.
  • In some examples, collision awareness system 140 may incorporate or integrate signals or sensor input from a combination of surface movement radar, multilateration sensors, and satellites. One or more of the types of airport ground surface sensors may be configured to generate signals indicative of positions of transient ground objects to within a selected accuracy, such as five meters, for example, enabling processing circuitry 142 to generate position and velocity information for the transient ground objects of a similar level of accuracy. Processing circuitry 142 may also be at least at times (at all times or at only certain times) communicatively connected to sensors positioned outside the vicinity of the airport, such as imaging sensors hosted on satellites, airships, or drones with imaging and communication capability.
  • Processing circuitry 142 may be configured to use data from SMR transceiver 720 and/or multilateration sensors 722 and multilateration reference transmitter 724 to evaluate or determine positions and velocities of transient surface objects on the airport ground surfaces, and to generate position and velocity information for one or more transient surface objects on the airport ground surfaces based at least in part on data from the SMR transceiver 720 and/or from the multilateration sensors 722, and/or from one or more other airport ground surface sensors. In some examples, processing circuitry 142 may generate positions and velocities at one or more times of one or more airport ground vehicles or other ground support equipment, such as refueler trucks, pushback tugs, airport busses, container loaders, belt loaders, baggage carts, catering vehicles, emergency vehicles, snow removal vehicles, or ground maintenance equipment, for example.
  • Multilateration sensors 722 may in some examples perform active cooperative interrogation of moving aircraft on the airport ground surfaces. For example, multilateration sensors 722 may transmit interrogation signals via a 1030/1090 megahertz (MHz) Traffic Collision Avoidance System (TCAS) surveillance band. In some examples, multilateration sensors 722 may include Automatic Dependent Surveillance - Broadcast (ADS-B) transceivers (e.g., Mode S ADS-B transceivers) configured to receive ADS-B messages from aircraft on the airport ground surface. Various aircraft in motion on the airport ground surfaces (at least, aircraft that have their ADS-B systems active while on the ground) may automatically transmit ADS-B messages that may be received by multilateration sensors 722. Multilateration sensors 722 using ADS-B may receive the ADS-B messages and communicate the ADS-B messages, potentially with additional data such as time of receipt, to processing circuitry 142, thus facilitating processing circuitry 142 determining and generating position and velocity information for the responding aircraft.
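As a simplified illustration of the multilateration principle (actual systems work with time differences of arrival of 1030/1090 MHz replies across many receivers; here, 2-D ranges to three hypothetical receiver sites are assumed), a surface position can be recovered by linearizing the range equations against a reference receiver and solving the result in a least-squares sense:

```python
import numpy as np

def trilaterate_2d(receivers, ranges):
    """Solve for a 2-D position from ranges to known receiver sites by
    subtracting the first circle equation from the others (which cancels
    the quadratic terms) and solving the linear least-squares system."""
    (x1, y1), r1 = receivers[0], ranges[0]
    a_rows, b_rows = [], []
    for (xi, yi), ri in zip(receivers[1:], ranges[1:]):
        a_rows.append([2.0 * (xi - x1), 2.0 * (yi - y1)])
        b_rows.append(r1**2 - ri**2 + xi**2 - x1**2 + yi**2 - y1**2)
    sol, *_ = np.linalg.lstsq(np.array(a_rows), np.array(b_rows), rcond=None)
    return sol  # estimated (x, y)

# Hypothetical receiver sites around an airport surface (meters).
sites = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0)]
true_pos = np.array([400.0, 300.0])
measured = [float(np.hypot(*(true_pos - s))) for s in sites]
print(trilaterate_2d(sites, measured))  # recovers approximately (400, 300)
```

With more than three receivers, the same least-squares formulation averages out measurement noise rather than solving exactly.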
  • In some examples, processing circuitry 142 of collision awareness system 140 may be configured to output the position and velocity information generated for transient surface objects to communication circuitry 144 and thus to extended range wireless router 710 for transmission over a wireless local area network. For example, processing circuitry 142 of collision awareness system 140 may output position and velocity information for one or more transient ground surface objects at a selected ethernet connection or output port to an IP address, where the selected ethernet connection or output port is connected to communication circuitry 144, for example, via a WNIC.
  • The extended range wireless network established by wireless router 710 (and potentially additional wireless routers in communicative connection with communication circuitry 144) may extend its range across the airport, and include all of the taxiways, runways, gate areas, apron areas, hangar bays, and other trafficways in its range. The extended range wireless network provided by wireless router 710 may thus include all of the aircraft on the airport ground surfaces within range, and may potentially provide wireless connectivity in the cockpits of all of the aircraft, including to EFBs of the pilots or flight crew of the various aircraft. In some examples, extended range wireless router 710 may be incorporated together with collision awareness system 140 in a single unit or component.
  • Thus, in various examples, in collision awareness system 140 including processing circuitry 142 and communication circuitry 144, processing circuitry 142 is configured to receive data from one or more airport ground surface sensors (e.g., one or both of SMR transceiver 720 and multilateration sensors 722) configured to detect transient surface objects on an airport ground surface. Processing circuitry 142 of collision awareness system 140 may be further configured to generate position and velocity information for one or more transient surface objects on the airport ground surface based at least in part on the data from the one or more airport ground surface sensors. Communication circuitry 144 of collision awareness system 140 may be configured to receive the position and velocity information for the one or more transient surface objects from processing circuitry 142 and to output the position and velocity information for the one or more transient surface objects to wireless router 710 for transmission over a wireless local area network.
  • Any of a wide variety of processing devices, such as collision awareness system 140, other components that interface with collision awareness system 140 and/or implement one or more techniques of collision awareness system 140, or other central processing units, ASICs, graphical processing units, computing devices, or processing devices of any other type may perform process 300 or portions or aspects thereof. Collision awareness system 140 and/or other components that interface with collision awareness system 140 and/or implement one or more techniques of collision awareness system 140 as disclosed above may be implemented in any of a variety of types of circuit elements. For example, processors of collision awareness system 140 or other components that interface with collision awareness system 140 and/or implement one or more techniques of collision awareness system 140 may be implemented as one or more ASICs, as a magnetic nonvolatile random-access memory (RAM) or other types of memory, a mixed-signal integrated circuit, a central processing unit (CPU), a field programmable gate array (FPGA), a microcontroller, a programmable logic controller (PLC), a system on a chip (SoC), a subsection of any of the above, an interconnected or distributed combination of any of the above, or any other type of component or one or more components capable of being configured to predict collision zones at a prospective time using guidance features of an aerodrome, historical data, and/or clearance information, and perform other functions in accordance with any of the examples disclosed herein.
  • Functions executed by electronics associated with the devices and systems described herein may be implemented, at least in part, by hardware, software, firmware, or any combination thereof. For example, various aspects of the techniques may be implemented within one or more processors, including one or more microprocessors, DSPs, ASICs, FPGAs, or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components, embodied in electronics included in components of system 140 or other systems described herein. The terms "processor," "processing device," or "processing circuitry" may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry.
  • Such hardware, software, firmware may be implemented within the same device or within separate devices to support the various operations and functions described in this disclosure. In addition, any of the described units, modules or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units must be realized by separate hardware or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware or software components, or integrated within common or separate hardware or software components. For example, while collision awareness system 140 is shown as a separate system in FIG. 1, collision awareness system 140 may execute on one or more of data server(s) 132, vehicles 111, traffic controller 114, user interface devices 104, data stores 105, or any combination thereof. In one example, collision awareness system 140 may be implemented across multiple devices, such as data server(s) 132 and vehicles 111 simultaneously.
  • When implemented in software, functionality ascribed to the devices and systems described herein may be embodied as instructions on a computer-readable medium such as random access memory (RAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, magnetic data storage media, optical data storage media, or the like. The instructions may be executed to support one or more aspects of the functionality described in this disclosure. The computer-readable medium may be non-transitory.
  • Various aspects of the disclosure have been described. These and other aspects are within the scope of the following claims.

Claims (15)

  1. A ground collision awareness system comprising:
    a memory configured to store:
    historical navigation route data for one or more reference vehicles, wherein the historical navigation route data are based on transponder positional data, and
    a plurality of aerodrome guidance features for one or more aerodrome locations, wherein the aerodrome guidance features include guidance marker information; and
    a processor in communication with the memory, wherein the processor is configured to:
    determine a predicted path of a first vehicle, the predicted path comprising a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information, and the second portion of the predicted path being predicted using the historical navigation route data;
    determine a predicted position of the first vehicle along the predicted path at a prospective time;
    determine a predicted position of a second vehicle with respect to approximately the same prospective time;
    perform a comparison of a first vehicle envelope for the first vehicle and a second vehicle envelope for the second vehicle at the predicted positions;
    identify an overlap of the first vehicle envelope and the second vehicle envelope; and
    determine a predicted collision zone of the first vehicle and the second vehicle at the prospective time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
  2. The system of claim 1, wherein to determine the predicted path of the first vehicle, the processor is further configured to:
    determine a first initially predicted path of the first vehicle using the guidance marker information and the historical navigation route data;
    identify a first likelihood of the first vehicle traveling the first initially predicted path;
    determine a second initially predicted path of the first vehicle using the guidance marker information and the historical navigation route data;
    identify a second likelihood of the first vehicle traveling the second initially predicted path; and
    determine the predicted path of the first vehicle based on a combination of the first initially predicted path of the first vehicle and the second initially predicted path of the first vehicle, the combination being based at least in part on the first likelihood and the second likelihood.
  3. The system of claim 1, wherein the processor is further configured to:
    identify clearance information of a traffic controller defining one or more target marks for the first vehicle;
    identify one or more target aiming features from the plurality of aerodrome guidance features based at least in part on the clearance information, the one or more target aiming features configured to provide guidance through a particular part of a particular aerodrome location toward the one or more target marks;
    identify the first portion of the predicted path using the one or more target aiming features; and
    identify the second portion of the predicted path using the clearance information and historical navigation route data, the second portion of the predicted path including a destination location of the first vehicle defined by the clearance information as a target mark of the one or more target marks.
  4. The system of claim 1, wherein the processor is further configured to:
    receive ground surface sensor data for the first vehicle or the second vehicle;
    determine a current position of the first vehicle or the second vehicle from the ground surface sensor data; and
    identify the predicted position of the first vehicle or the predicted position of the second vehicle using the current position of the first vehicle or the second vehicle.
  5. The system of claim 1, wherein the processor is further configured to:
    determine movement information of the first vehicle at a current position, the movement information including speed information of the first vehicle; and
    identify the predicted position of the first vehicle using the movement information and the historical navigation route data.
  6. The system of claim 1, wherein the processor is further configured to:
    receive a current position of the first vehicle; and
    transmit the predicted collision zone from a remote server to the first vehicle using a wireless network.
  7. The system of claim 1, wherein the processor is further configured to:
    identify a predicted path of the second vehicle using a predicted current position of the second vehicle, the predicted current position of the second vehicle based on the historical navigation route data; and
    determine the predicted position of the second vehicle using the predicted current position of the second vehicle and one or more of: the historical navigation route data, the plurality of aerodrome guidance features, or clearance information for the second vehicle.
  8. The system of claim 1, wherein the processor is further configured to:
    transmit the predicted collision zone to a device corresponding to at least one of: the first vehicle, the second vehicle, a third vehicle configured to transport the first or second vehicles, or a remote server.
  9. The system of claim 1, wherein the first vehicle includes the ground collision awareness system, or wherein the first portion of the predicted path includes a taxiway comprising surface guidance markings and the second portion of the predicted path includes an airport apron region.
  10. A method of predicting vehicle collision zones, the method comprising:
    obtaining, by processing circuitry of a ground collision awareness system, historical navigation route data for one or more reference vehicles, the historical navigation route data being based on transponder positional data;
    identifying, by the processing circuitry, a plurality of aerodrome guidance features for a particular aerodrome location, the aerodrome guidance features including guidance marker information;
    determining, by the processing circuitry, a predicted path of a first vehicle, the predicted path comprising a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information, and the second portion of the predicted path being predicted using the historical navigation route data;
    determining, by the processing circuitry, a predicted position of the first vehicle along the predicted path at a prospective time;
    determining, by the processing circuitry, a predicted position of a second vehicle with respect to approximately the same prospective time;
    performing, by the processing circuitry, a comparison of a first vehicle envelope for the first vehicle and a second vehicle envelope for the second vehicle at the predicted positions;
    identifying, by the processing circuitry, an overlap of the first vehicle envelope and the second vehicle envelope; and
    determining, by the processing circuitry, a predicted collision zone of the first vehicle and the second vehicle at the prospective time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
  11. The method of claim 10, wherein determining the predicted path further comprises:
    determining a first initially predicted path of the first vehicle using the guidance marker information and the historical navigation route data;
    identifying a first likelihood of the first vehicle traveling the first initially predicted path;
    determining a second initially predicted path of the first vehicle using the guidance marker information and the historical navigation route data;
    identifying a second likelihood of the first vehicle traveling the second initially predicted path; and
    determining the predicted path of the first vehicle based on a combination of the first initially predicted path of the first vehicle and the second initially predicted path of the first vehicle, the combination being based at least in part on the first likelihood and the second likelihood.
  12. The method of claim 10, wherein determining the predicted path further comprises:
    identifying, by the processing circuitry, clearance information of a traffic controller defining one or more target marks for the first vehicle;
    identifying, by the processing circuitry, one or more target aiming features from the plurality of aerodrome guidance features based at least in part on the clearance information, the one or more target aiming features configured to provide guidance through a particular part of the particular aerodrome location toward the one or more target marks;
    identifying, by the processing circuitry, the first portion of the predicted path using the one or more target aiming features; and
    identifying, by the processing circuitry, the second portion of the predicted path using the clearance information and historical navigation route data, the second portion of the predicted path including a destination location of the first vehicle defined by the clearance information as a target mark of the one or more target marks.
  13. The method of claim 10, further comprising:
    receiving ground surface sensor data for the first vehicle or the second vehicle;
    determining a current position of the first vehicle or the second vehicle from the ground surface sensor data; and
    identifying the predicted position of the first vehicle or the predicted position of the second vehicle using the current position of the first vehicle or the second vehicle.
  14. The method of claim 10, further comprising:
    determining movement information of the first vehicle at a current position, the movement information including speed information of the first vehicle; and
    identifying the predicted position of the first vehicle using the movement information and the historical navigation route data.
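Projecting a predicted position from speed information and a historical route amounts to dead reckoning along the route polyline. The helper name and the piecewise-linear advance are illustrative assumptions:

```python
# Illustrative sketch: advance the vehicle along a historical route by
# distance = speed * elapsed time, interpolating within route segments.
import math

def advance_along_route(route, start_index, speed_mps, dt_s):
    """Predict a position by moving speed*dt metres along route[start_index:]."""
    remaining = speed_mps * dt_s
    x, y = route[start_index]
    for nx, ny in route[start_index + 1:]:
        seg = math.hypot(nx - x, ny - y)
        if remaining <= seg:
            f = remaining / seg
            return (x + f * (nx - x), y + f * (ny - y))
        remaining -= seg
        x, y = nx, ny
    return (x, y)  # route exhausted: clamp to its final point

# 10 m/s for 15 s along an L-shaped historical taxi route.
pos = advance_along_route([(0, 0), (100, 0), (100, 100)], 0, speed_mps=10, dt_s=15)
```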
  15. The method of claim 10, wherein the processing circuitry corresponds to a remote server, wherein the method further comprises:
    receiving, by the processing circuitry of the remote server, a current position of the first vehicle; and
    transmitting, by the processing circuitry, the predicted collision zone from the remote server to the first vehicle.
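The remote-server arrangement of claim 15 receives positions, predicts, and transmits a predicted collision zone back to the vehicle. A minimal sketch of the conflict test the server might run is below; the function name, the protection radius, and the midpoint zone representation are all assumptions, not the patented logic:

```python
# Hedged sketch of a server-side conflict check: flag a predicted
# collision zone when two predicted positions fall within a protection
# radius, returning the midpoint as the zone to transmit to the vehicle.
import math

def predicted_collision_zone(pred_pos_1, pred_pos_2, protection_radius_m=50.0):
    """Return the conflict midpoint, or None if the positions are clear."""
    dx = pred_pos_1[0] - pred_pos_2[0]
    dy = pred_pos_1[1] - pred_pos_2[1]
    if math.hypot(dx, dy) > protection_radius_m:
        return None  # no predicted conflict to transmit
    return ((pred_pos_1[0] + pred_pos_2[0]) / 2,
            (pred_pos_1[1] + pred_pos_2[1]) / 2)

zone = predicted_collision_zone((100.0, 40.0), (120.0, 40.0))
```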
EP21154155.2A — Collision awareness using historical data for vehicles — Priority date: 2020-02-14 — Filing date: 2021-01-28 — Published as EP3866139A1 (en) — Status: Pending

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
IN202011006508 | 2020-02-14 | |
US17/070,830 (US11854418B2) | 2020-02-14 | 2020-10-14 | Collision awareness using historical data for vehicles

Publications (1)

Publication Number | Publication Date
EP3866139A1 | 2021-08-18

Family

ID: 74418171

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
EP21154155.2A | Collision awareness using historical data for vehicles | 2020-02-14 | 2021-01-28

Country Status (2)

Country | Publication
EP | EP3866139A1 (en)
CN | CN113838309A (en)

Cited By (1)

(* cited by examiner, † cited by third party)

Publication Number | Priority Date | Publication Date | Assignee | Title
US12007774B2 | 2022-03-25 | 2024-06-11 | Rockwell Collins, Inc. | System and method for guidance integrity monitoring for low-integrity modules

Families Citing this family (2)

(* cited by examiner, † cited by third party)

Publication Number | Priority Date | Publication Date | Assignee | Title
SE544549C2 * | 2020-03-13 | 2022-07-12 | Saab Ab | A method, computer program product, system and craft for collision avoidance
US20220366794A1 * | 2021-05-11 | 2022-11-17 | Honeywell International Inc. | Systems and methods for ground-based automated flight management of urban air mobility vehicles

Citations (3)

(* cited by examiner, † cited by third party)

Publication Number | Priority Date | Publication Date | Assignee | Title
US20160196754A1 | 2015-01-06 | 2016-07-07 | Honeywell International Inc. | Airport surface monitoring system with wireless network interface to aircraft surface navigation system
US20190228668A1 * | 2018-01-24 | 2019-07-25 | Honeywell International Inc. | Method and system for automatically predicting a surface movement path for an aircraft based on historical trajectory data
US20190381977A1 * | 2018-06-15 | 2019-12-19 | Honeywell International Inc. | Methods and systems for vehicle contact prediction and auto brake activation

Also Published As

Publication Number | Publication Date
CN113838309A | 2021-12-24

Similar Documents

Publication Publication Date Title
EP3866139A1 (en) Collision awareness using historical data for vehicles
US11854418B2 (en) Collision awareness using historical data for vehicles
US11900823B2 (en) Systems and methods for computing flight controls for vehicle landing
EP3043331A2 (en) Airport surface monitoring system with wireless network interface to aircraft surface navigation system
US11113980B2 (en) Boolean mathematics approach to air traffic management
US20210255616A1 (en) Systems and methods for automated cross-vehicle navigation using sensor data fusion
EP3531400A1 (en) Systems and methods for detecting potential surface collisions and providing warnings onboard an aircraft or airport vehicle
US11763555B2 (en) System and method for ground obstacle detection and database management
EP3693948A1 (en) Detect and avoid integration with controller pilot data link communications (cpdlc)
CN111512354B (en) Aircraft traffic control method
EP3276590A1 (en) Prediction of vehicle maneuvers
US11847925B2 (en) Systems and methods to display an elevated landing port for an urban air mobility vehicle
EP4063987A1 (en) Systems and methods for identifying landing zones for unmanned aircraft
US20220309934A1 (en) Systems and methods for detect and avoid system for beyond visual line of sight operations of urban air mobility in airspace
US20220309931A1 (en) Systems and methods for guiding vehicles to charging points
US20230410667A1 (en) Autonomous air taxi separation system and method
EP4080482A1 (en) System and method for obstacle detection and database management
US11994880B2 (en) Methods and systems for unmanned aerial vehicles to detect and avoid other flying machines
EP3859712A1 (en) Collision awareness using cameras mounted on a vehicle
EP4064245A1 (en) Systems and methods for detect and avoid system for beyond visual line of sight operations of urban air mobility in airspace
CN116235232B (en) Autonomous air taxi-interval system and method
EP4080481A1 (en) Systems and methods to display an elevated landing port for an urban air mobility vehicle
EP4063177A1 (en) Systems and methods for guiding vehicles to charging points

Legal Events

Date | Code | Description
 | PUAI | Public reference made under Article 153(3) EPC to a published international application that has entered the European phase (original code: 0009012)
 | STAA | Status: the application has been published
 | AK | Designated contracting states (kind code of ref document: A1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
 | STAA | Status: request for examination was made
2021-10-25 | 17P | Request for examination filed
 | RBV | Designated contracting states (corrected): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
2023-04-21 | P01 | Opt-out of the competence of the Unified Patent Court (UPC) registered