WO2019067695A1 - Flight control using computer vision - Google Patents

Flight control using computer vision

Info

Publication number
WO2019067695A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
aerial vehicle
image
uav
determining
Prior art date
Application number
PCT/US2018/053084
Other languages
English (en)
Inventor
Guy Bar-Nahum
Hong-Bin YOON
Karthik GOVINDASWAMY
Hoang A. NGUYEN
Original Assignee
Airspace Systems, Inc.
Priority date
Filing date
Publication date
Priority claimed from US15/729,581 (US10325169B2)
Priority claimed from US16/142,452 (US10514711B2)
Application filed by Airspace Systems, Inc.
Publication of WO2019067695A1


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 Target-seeking control
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 Systems determining position data of a target
    • G01S13/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G01S13/72 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/91 Radar or analogous systems specially adapted for specific applications for traffic control
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/411 Identification of targets based on measurements of radar reflectivity
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415 Identification of targets based on measurements of movement associated with the target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/417 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms
    • B64U10/16 Flying platforms with five or more distinct rotor axes, e.g. octocopters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U60/00 Undercarriages
    • B64U60/50 Undercarriages with landing legs
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U70/00 Launching, take-off or landing arrangements
    • B64U70/20 Launching, take-off or landing arrangements for releasing or capturing UAVs in flight by another aircraft

Definitions

  • Unmanned aerial vehicles and drones are typically controlled using flight controllers manually operated by a human user. This requires the human user to have visual sight of the aerial vehicle and/or an image captured by the aerial vehicle. However, this may be challenging in circumstances where a trained human user is unavailable or unable to react quickly enough to perform a desired flight maneuver. Additionally, if communication between the aerial vehicle and the flight controller is lost or becomes unreliable, the aerial vehicle may be unable to complete a desired flight operation.
  • Figure 1 is a block diagram illustrating an embodiment of a system for managing an airspace.
  • Figure 2 is a block diagram illustrating an embodiment of an aerial vehicle.
  • Figure 3A is a diagram illustrating a front view of a UAV in accordance with some embodiments.
  • Figure 3B is a diagram illustrating a side view of a UAV in accordance with some embodiments.
  • unmanned aerial vehicle 301 may be used to implement a UAV, such as UAV 100.
  • Figure 4 is a flowchart illustrating an embodiment of a process for automatically controlling flight of a vehicle.
  • Figure 5 is a flowchart illustrating an embodiment of a process for determining a vector associated with a target.
  • Figure 6 illustrates an example of a two-dimensional vector determined using an image from a camera feed.
  • Figure 7 illustrates an example of a three-dimensional vector determined using two-dimensional measurements.
  • Figure 8 shows example equations that can be utilized to determine a three-dimensional vector.
  • Figure 9 illustrates examples of functions utilized to adjust component values of a vector.
  • Figure 10 shows example equations that can be utilized to scale/alter component values of a vector.
  • Figure 11 illustrates an embodiment of a spatio-temporal awareness engine 1100.
  • Figure 12 illustrates an embodiment of a tree-based region selection process.
  • Figure 13 illustrates an embodiment of a tree-based region selection process.
  • Figure 14 illustrates an embodiment of parallel image processing process 1400.
  • Figure 15 illustrates a tracking system 1500 in accordance with one embodiment.
  • Figure 16 illustrates an embodiment of a quadtree 1600.
  • Figure 17 illustrates an embodiment of a system 1700 for converting camera input into a vector for low resolution tracker 1510.
  • Figure 18 illustrates an embodiment of a subsystem 1800 for prioritizing a region of interest in the focus of a camera.
  • Figure 19 illustrates several components of an exemplary region of interest tracking system 1900 in accordance with one embodiment.
  • Figure 20 illustrates an embodiment of drone operation logic.
  • Figure 21 illustrates an embodiment of a system operating a multimodal sensor empowered awareness engine.
  • Figure 22 illustrates an embodiment of a process for operating multimodal sensor empowered awareness engine.
  • Figure 23 illustrates an embodiment of a system operating a multimodal sensor empowered awareness engine.
  • Figure 24 illustrates an embodiment of a system operating a multimodal sensor empowered awareness engine.
  • the invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor.
  • these implementations, or any other form that the invention may take, may be referred to as techniques.
  • the order of the steps of disclosed processes may be altered within the scope of the invention.
  • a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task.
  • the term 'processor' refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
  • Unmanned Aerial Platforms, including Unmanned Aerial Vehicles (UAVs) and Aerial Drones, may be used for a variety of applications. However, some applications may pose a risk to people or property. UAVs have been used to carry contraband, including drugs, weapons, and counterfeit goods, across international borders. It is further possible that UAVs may be used for voyeuristic or industrial surveillance, or to commit terrorist acts such as spreading toxins or transporting an explosive device. In view of this risk posed by malicious UAVs, it may be necessary to have a system to intercept, capture, and transport away a UAV that has entered a restricted area.
  • An interceptor aerial vehicle may be utilized to monitor and/or capture offending aerial vehicles. For example, when a threatening UAV is detected, the interceptor aerial vehicle may be deployed to fly to the target UAV and capture it (e.g., using a net fired from the interceptor aerial vehicle over the target to capture the target in the net for transport to a safe location). However, because the target UAV can move, the interceptor aerial vehicle needs to be able to follow the target UAV and close in until the target UAV is within range of a capture system. For at least the previously mentioned reasons, it may not be desirable to manually control the interceptor aerial vehicle to follow the target UAV. In some embodiments, the interceptor aerial vehicle is able to autonomously track and fly to the target UAV using machine vision based on images captured by a camera of the interceptor aerial vehicle.
  • an image is received and a target aerial vehicle is detected in the image.
  • a stream of images captured by one or more cameras of an interceptor aerial vehicle is analyzed using deep learning, neural networks, and/or other machine learning techniques to recognize a target aerial vehicle in the captured image.
  • a three-dimensional relative location of the target aerial vehicle is determined with respect to a reference aerial vehicle based on the image. For example, azimuth and altitude directions as well as distance from the reference aerial vehicle are determined based on the image.
  • a flight control operation is performed based on the three-dimensional relative location. For example, the reference aerial vehicle is guided towards the three-dimensional relative location at a dynamically determined speed.
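  • As a rough illustration of the detect-locate-steer loop described in the preceding bullets (a sketch, not the application's implementation), the Python block below assumes a hypothetical detector object with a detect_target method that returns a pixel bounding box, derives azimuth and altitude angles from the box center and the camera field of view, estimates distance from the box's apparent width using a pinhole-camera model, and hands the result to a hypothetical flight controller. The camera parameters and target size are assumed values.

        import math

        # Assumed camera parameters and target size (hypothetical values).
        IMAGE_W, IMAGE_H = 1280, 720           # image resolution, pixels
        HFOV = math.radians(90.0)              # horizontal field of view
        VFOV = math.radians(60.0)              # vertical field of view
        TARGET_WIDTH_M = 0.35                  # assumed physical width of a small UAV

        def box_to_relative_location(box):
            """Convert a bounding box (x, y, w, h in pixels) into approximate
            (azimuth, altitude, distance) relative to the camera axis."""
            x, y, w, h = box
            cx, cy = x + w / 2.0, y + h / 2.0
            azimuth = (cx - IMAGE_W / 2.0) / IMAGE_W * HFOV
            altitude = -(cy - IMAGE_H / 2.0) / IMAGE_H * VFOV   # up is positive
            focal_px = (IMAGE_W / 2.0) / math.tan(HFOV / 2.0)
            distance = TARGET_WIDTH_M * focal_px / max(w, 1)    # pinhole estimate
            return azimuth, altitude, distance

        def track_target(camera, detector, flight_controller):
            """One possible detect-locate-steer loop (hypothetical interfaces)."""
            for frame in camera.frames():
                box = detector.detect_target(frame)   # e.g., a neural-network detector
                if box is None:
                    continue
                az, alt, dist = box_to_relative_location(box)
                speed = min(1.0, dist / 20.0)         # slow down as the target nears
                flight_controller.steer(azimuth=az, altitude=alt, speed=speed)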
  • Figure 1 is a block diagram illustrating an embodiment of a system for managing an airspace.
  • Examples of interceptor aerial vehicle 102 include a drone, a multirotor aircraft, an airplane, a UAV, a helicopter, and any other vehicle capable of flight.
  • Interceptor aerial vehicle 102 may be deployed to patrol a specified airspace and/or monitor/interdict a target aerial vehicle. For example, interceptor aerial vehicle 102 follows and/or navigates to a location associated with the target aerial vehicle (e.g., target aerial vehicle 110).
  • Ground station 104 is used to manage interceptor aerial vehicle 102.
  • Ground station 104 is in communication with interceptor aerial vehicle 102.
  • ground station 104 provides instructions, commands, sensor data, and/or other data that can be used by interceptor aerial vehicle 102 to monitor an airspace and/or navigate to and/or capture a target aerial vehicle. Interceptor aerial vehicle 102 may provide status and/or reporting data back to ground station 104.
  • Ground station 104 may include one or more sensors utilized to detect a location of aerial vehicles within an airspace.
  • ground station 104 may include one or more radars, cameras, wireless communication sensors, and/or LIDAR sensors monitoring the airspace.
  • Ground station 104 may also receive information from one or more other ground-based sensors.
  • One example of the ground-based sensor is ground-based sensor 106. Examples of ground-based sensor 106 may include one or more of radars, cameras, wireless communication sensors, and/or LIDAR sensors.
  • interceptor aerial vehicle 102 may be deployed to interdict/capture the unauthorized aerial vehicle.
  • An example of the unauthorized aerial vehicle is target aerial vehicle 110 (e.g., a drone, a multirotor aircraft, an airplane, a UAV, a helicopter, or any other vehicle capable of flight).
  • the detected location of target aerial vehicle 110 may be provided to interceptor aerial vehicle 102 to allow it to automatically and autonomously fly towards the direction of the detected location.
  • Interceptor aerial vehicle 102 includes one or more computer vision sensors and a processor that can be used to dynamically detect the unauthorized aerial vehicle and autonomously navigate interceptor aerial vehicle 102 towards the target aerial vehicle.
  • ground station 104 is included in a mobile platform that is able to be transported to different physical locations.
  • ground station 104 is on a movable platform with wheels that may be towed or may include an engine to also serve as a vehicle.
  • ground station 104 includes a hangar that can be used to transport and house interceptor aerial vehicle 102.
  • ground station 104 is able to communicate in real time with interceptor aerial vehicle 102 to monitor and, if necessary, redirect detection and tracking based on security or threat level.
  • the telecommunications structure of ground station 104 is configured to receive and transmit signals. More specifically, the transmission protocol may include but is not limited to RF, wireless/Wi-Fi, Bluetooth, Zigbee, cellular, and others.
  • the telecommunications structure is configured to receive multiple streams of communication in different protocols, and to combine and thread the different communication inputs. Further, the telecommunications structure may also be configured to receive low altitude signals, such as light transmission in various colors, intensities, patterns and shapes, which may be used to identify a target drone.
  • an aerial vehicle launched from the hangar may contain onboard processing capability to perform substantially the same detection, and provide the information to ground station 104.
  • the identity of the target aerial vehicle may be further compared and associated with a user by additional criteria, such as authentication of user equipment. For example, if a target drone is identified as friend or foe, at a first level of authentication, an owner or user associated with a mobile computing device may receive a query, such as a text message, email or other communication. The user may be required to authenticate ownership, operation, control, or other association with the target drone, prior to the target drone being cleared for operation.
  • If the first level of authentication does not indicate that the target drone is a "friend," further targeting and interdiction may occur.
  • If the second level of authentication does not result in the user providing sufficient factors or clearances to confirm association with the target drone, then even if the target drone is determined to be a "friend" at the first authentication stage, that classification may be converted to "foe," or the "friend" determination may not be implemented.
  • interceptor aerial vehicle 102 may be deployed to patrol an airspace.
  • the interceptor aerial vehicle and associated interface may be programmed to automatically fly a patrol path.
  • the user interface may transition to another patrol path so that the user can continue to monitor patrol paths for other aerial vehicles.
  • the user interface may transition to interdiction mode.
  • configurations may be provided where one or more users may continuously be in surveillance for patrol mode, and toggle or switch between multiple aerial vehicles, toggling off of aerial vehicles that have moved from surveillance mode to another mode.
  • Figure 2 is a block diagram illustrating an embodiment of an aerial vehicle.
  • UAV 200 is an example of aerial vehicle 102 of Figure 1.
  • UAV 200 comprises a radar system 202, one or more IMUs 206, and an interdiction system 207.
  • UAV 200 may include one or more other systems and/or components that are not shown (e.g., propellers, flight control system, landing control system, power system, etc.).
  • Radar system 202 is comprised of one or more antennas 203 and one or more processors 204.
  • the one or more antennas 203 may be a phased array, a parabolic reflector, a slotted waveguide, or any other type of antenna design used for radar.
  • the one or more processors 204 are configured to excite a transmission signal for the one or more antennas 203.
  • the transmission signal has a frequency f0. Depending on the antenna design, the transmission signal may have a frequency between 3 MHz and 220 GHz.
  • the one or more antennas are configured to operate in a frequency range of 79 MHz to 1 GHz.
  • the one or more antennas 203 are configured to transmit the signal.
  • the transmission signal may propagate through space.
  • the transmission signal may reflect off one or more objects.
  • the reflection signal may be received by the one or more antennas 203.
  • the reflection signal is received by a subset of the one or more antennas 203.
  • the reflection signal is received by all of the one or more antennas 203.
  • the strength (amplitude) of the received signal depends on a plurality of various factors, such as a distance between the one or more antennas 203 and the reflecting object, the medium in which the signal is transmitted, the environment, and the material of the reflecting object, etc.
  • the one or more processors 204 are configured to receive the reflection signal from the one or more antennas 203.
  • the one or more processors 204 are configured to determine a velocity of the detected object based on the transmission signal and the reflection signal. The velocity may be determined by computing the Doppler shift.
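  • For reference, the radial velocity implied by a measured Doppler shift follows the standard monostatic-radar relation v = c * f_d / (2 * f_0); the short Python sketch below applies that textbook formula and is not code from the application.

        C = 299_792_458.0  # speed of light, m/s

        def radial_velocity(doppler_shift_hz, tx_freq_hz):
            """Radial velocity from Doppler shift: v = c * f_d / (2 * f_0)."""
            return C * doppler_shift_hz / (2.0 * tx_freq_hz)

        # Example: a 2 kHz shift on a 24 GHz carrier corresponds to roughly 12.5 m/s.
        print(radial_velocity(2_000.0, 24e9))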
  • a detected object may have one or more associated velocities.
  • An object without any moving parts, such as a balloon, may be associated with a single velocity.
  • An object with moving parts, such as a car, helicopter, UAV, plane, etc., may be associated with more than one velocity.
  • the main body of the object may have an associated velocity.
  • the moving parts of the object may each have an associated velocity.
  • a UAV is comprised of a body portion and a plurality of propellers. The body portion of the UAV may be associated with a first velocity.
  • Each of the propellers may be associated with corresponding velocities.
  • the one or more antennas 203 are a phased antenna array.
  • a beam associated with the phased antenna array may be directed towards the object.
  • a beam former (e.g., the one or more processors 204) may control the phase and relative amplitude of the signal at each transmitting antenna of the antenna array in order to create a pattern of constructive and destructive interference in the wave front.
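  • The beam-steering idea in the preceding bullet can be summarized with the textbook phase weights for a uniform linear array (a generic formula, not the application's beamformer); the element count, spacing, and steering angle in the Python sketch below are assumed values.

        import cmath, math

        def steering_weights(num_elements, spacing_m, wavelength_m, steer_angle_rad):
            """Per-element complex weights for a uniform linear array: a progressive
            phase shift makes the element signals add constructively in the chosen
            direction and destructively elsewhere."""
            k = 2.0 * math.pi / wavelength_m
            return [cmath.exp(-1j * k * spacing_m * n * math.sin(steer_angle_rad))
                    for n in range(num_elements)]

        # Example: 8 elements at half-wavelength spacing, steered 20 degrees off boresight.
        wavelength = 3e8 / 24e9
        weights = steering_weights(8, wavelength / 2.0, wavelength, math.radians(20.0))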
  • Radar system 202 is coupled to one or more inertial measurement units 206.
  • the one or more inertial measurement units 206 are configured to calculate attitude, angular rates, linear velocity, and/or a position relative to a global reference frame.
  • the one or more processors 204 may use the measurements from the one or more IMUs 206 to determine an EGO motion of UAV 200.
  • the one or more processors 204 may also use one or more extended Kalman filters to smooth the measurements from the one or more inertial measurement units 206.
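  • As a simplified stand-in for the extended Kalman filter mentioned above, the Python sketch below smooths a single noisy scalar stream (for example, one IMU angular-rate channel) with a one-dimensional linear Kalman filter; the noise variances are assumed values, and a real implementation would estimate the full attitude/velocity state.

        class ScalarKalman:
            """Minimal 1-D Kalman filter (constant-value model) for smoothing a
            noisy measurement stream."""
            def __init__(self, process_var=1e-3, meas_var=1e-1):
                self.x = 0.0   # state estimate
                self.p = 1.0   # estimate variance
                self.q = process_var
                self.r = meas_var

            def update(self, z):
                self.p += self.q                   # predict: uncertainty grows
                k = self.p / (self.p + self.r)     # Kalman gain
                self.x += k * (z - self.x)         # blend prediction and measurement
                self.p *= (1.0 - k)
                return self.x

        kf = ScalarKalman()
        smoothed = [kf.update(z) for z in [0.11, 0.09, 0.15, 0.08, 0.12]]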
  • One or more computer vision-based algorithms (e.g., optical flow) may also be used to determine the EGO motion of UAV 200.
  • the one or more processors 204 may be configured to remove the EGO motion data of UAV 200 from the reflection signal data to determine one or more velocities associated with a detected object. From UAV 200's perspective, every detected item appears to be moving when UAV 200 is flying. Removing the EGO motion data from the velocity determination allows radar system 202 to determine which detected objects are static and/or which detected objects are moving. The one or more determined velocities may be used to determine a micro-Doppler signature of an object.
  • the one or more processors 204 may generate a velocity profile from the reflected signal to determine a micro-Doppler signature associated with the detected object.
  • the velocity profile compares a velocity of the reflection signal(s) with an amplitude (strength) of the reflection signal(s).
  • the velocity axis of the velocity profile is comprised of a plurality of bins.
  • a velocity of the reflection signal with the highest amplitude may be identified as a reference velocity and the amplitude associated with the reference velocity may be associated with a reference bin (e.g., bin B0).
  • the one or more other velocities included in the reflection signal may be compared with respect to the reference velocity.
  • Each bin of the velocity profile represents an offset with respect to the reference velocity.
  • a corresponding bin for the one or more other velocities included in the reflection signal may be determined.
  • a determined bin includes an amplitude associated with one of the one or more other velocities included in the reflection signal.
  • a reflection signal may be a reflection signal associated with a UAV.
  • the UAV is comprised of a main body and a plurality of propellers.
  • the velocity of a UAV body may be represented as a reference velocity in the velocity profile.
  • the velocity of a UAV propeller may be represented in a bin offset from the reference velocity.
  • the bin associated with the reference velocity may store an amplitude associated with the velocity of the UAV body.
  • the bin offset from the reference bin may store an amplitude associated with the velocity of a UAV propeller.
  • a direction of a beam of the phased antenna array may be focused towards a detected object such that a plurality of antenna elements 203 receive a reflection signal from the detected object.
  • the plurality of antenna elements that receive a reflection signal may be adjacent to the antenna element that detected the object during the one dimensional MIMO scan.
  • a velocity profile for each of the received corresponding reflection signals may be generated.
  • the velocity profile for each of the received corresponding reflection signals may be combined with the velocity profile of the antenna element that detected the object during the one dimensional MIMO scan.
  • the combined velocity profile includes the same bins as one of the velocity profiles, but a bin of the combined velocity profile stores a plurality of amplitudes from the plurality of velocity profiles.
  • a maximum amplitude value (peak) may be selected for each bin of the combined velocity profile.
  • the maximum amplitude bin values may be used in a feature vector to classify the object.
  • the feature vector may include the values [B0max, B1max, ..., BNmax].
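  • One way to picture the binning described above: take the velocity with the strongest return as the reference, place every other detected velocity into a bin offset from that reference, keep the peak amplitude per bin across the contributing antenna elements, and concatenate those maxima into the feature vector. The Python sketch below illustrates that bookkeeping with assumed bin parameters; it is not the application's signal-processing code.

        import numpy as np

        def velocity_profile(velocities, amplitudes, bin_width=0.5, num_bins=32):
            """Binned velocity profile relative to the strongest return for one
            reflection signal (velocities and amplitudes are parallel arrays)."""
            velocities = np.asarray(velocities, dtype=float)
            amplitudes = np.asarray(amplitudes, dtype=float)
            ref_v = velocities[np.argmax(amplitudes)]        # reference velocity -> bin 0
            bins = np.zeros(num_bins)
            for v, a in zip(velocities, amplitudes):
                idx = int(round((v - ref_v) / bin_width)) + num_bins // 2
                if 0 <= idx < num_bins:
                    bins[idx] = max(bins[idx], a)            # keep peak amplitude per bin
            return bins

        def combined_feature_vector(profiles):
            """Per-bin maximum across several antenna elements' profiles,
            yielding [B0max, B1max, ..., BNmax]."""
            return np.max(np.stack(profiles), axis=0)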
  • Radar system 202 is coupled to processor 211. Radar system 202 may provide the feature vector to processor 211 and the processor 211 may apply the feature vector to one of the machine learning models 205 that is trained to determine whether the object is a UAV or not a UAV.
  • the one or more machine learning models 205 may be trained to label one or more objects. For example, a machine learning model may be trained to label an object as a "UAV" or “not a UAV.” A machine learning model may be trained to label an object as a "bird” or “not a bird.” A machine learning model may be trained to label an object as a "balloon" or "not a balloon.”
  • the one or more machine learning models 205 may be configured to implement one or more machine learning algorithms (e.g., support vector machine, softmax classifier, autoencoders, naive Bayes, logistic regression, decision trees, random forest, neural network, deep learning, nearest neighbor, etc.).
  • the one or more machine learning models 205 may be trained using a set of training data.
  • the set of training data includes a set of positive examples and a set of negative examples.
  • the set of positive examples may include a plurality of feature vectors that indicate the detected object is a UAV.
  • the set of negative examples may include a plurality of feature vectors that indicate the detected object is not a UAV.
  • the set of negative examples may include feature vectors associated with a balloon, bird, plane, helicopter, etc.
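  • A minimal sketch of training such a classifier on labeled feature vectors, using a support vector machine (one of the algorithm options listed above) via scikit-learn; the training arrays below are random placeholders, not data from the application.

        import numpy as np
        from sklearn.svm import SVC

        # Placeholder feature vectors: rows of per-bin peak amplitudes.
        positive = np.random.rand(100, 32)   # examples labeled "UAV"
        negative = np.random.rand(100, 32)   # examples labeled "not a UAV"

        X = np.vstack([positive, negative])
        y = np.array([1] * len(positive) + [0] * len(negative))

        model = SVC(kernel="rbf", probability=True)
        model.fit(X, y)

        def is_uav(feature_vector, threshold=0.5):
            """True if the model labels the detected object as a UAV."""
            return model.predict_proba([feature_vector])[0][1] >= threshold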
  • Feature vectors associated with detected UAVs may be provided to one or more other machine learning models that are trained to identify specific UAV models.
  • the velocity profile of a UAV may follow a general micro-Doppler signature, but within the general micro-Doppler signature, different types of UAVs may be associated with different micro-Doppler signatures.
  • the offset difference between a bin corresponding to a baseline velocity and a bin corresponding to a secondary velocity may have a first value for a first UAV and a second value for a second UAV.
  • Processor 211 may provide the output from the one or more machine learning models 205 to interdiction system 207.
  • Interdiction system 207 includes a capture net launcher 208, one or more sensors 209, and a control system 210.
  • the control system 210 may be configured to monitor signals received from the one or more sensors 209 and/or radar system 202, and control the capture net launcher 208 to automatically deploy the capture net when predefined firing conditions are met.
  • One of the predefined firing conditions may include an identification of a target UAV.
  • One of the predefined firing conditions may include a threshold range between the target UAV and UAV 200.
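  • Those firing conditions could be expressed as a simple guard like the Python sketch below; the function name and the 10 m threshold are assumptions for illustration, not values from the application.

        CAPTURE_RANGE_M = 10.0   # assumed threshold range

        def should_deploy_net(target_identified: bool, range_to_target_m: float) -> bool:
            """Deploy the capture net only when the object has been identified as the
            target UAV and is within the threshold capture range."""
            return target_identified and range_to_target_m <= CAPTURE_RANGE_M

        assert should_deploy_net(True, 7.5)        # identified and in range -> deploy
        assert not should_deploy_net(True, 25.0)   # identified but too far -> hold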
  • the one or more sensors 209 may include a global positioning system, a light detection and ranging (LIDAR) system, a sound navigation and ranging (SONAR) system, an image detection system (e.g., photo capture, video capture, UV capture, IR capture, etc.), sound detectors, one or more rangefinders, etc. For example, eight LIDAR or RADAR beams may be used in the rangefinder to detect proximity to the target UAV.
  • the one or more sensors 209 may include image capture sensors which may be controlled by the interdiction control system 210 to capture images of the object when detected by the range finding sensors. Based on the captured image and the range readings from the ranging sensors, the interdiction system may identify whether or not the object is the target UAV that is identified by radar system 202.
  • If interdiction control system 210 determines that the object is a target UAV, it may also determine whether the target UAV is in an optimal capture position relative to the defending UAV. If the relative position between the target UAV and the defending UAV is not optimal, interdiction control system 210 may provide a recommendation or indication to the remote controller of the UAV. Interdiction control system 210 may provide or suggest course corrections directly to processor 211 to maneuver the UAV into an ideal interception position autonomously or semi-autonomously. Once the ideal relative position between the target UAV and the defending UAV is achieved, interdiction control system 210 may automatically trigger capture net launcher 208. Once triggered, capture net launcher 208 may fire a net designed to ensnare the target UAV and disable its further flight.
  • the net fired by the capture net launcher may include a tether connected to UAV 200.
  • the tether may be connected to the defending UAV by a retractable servo controlled by the control system 210 such that the tether may be released based on a control signal from the control system 210.
  • Control system 210 may be configured to sense the weight, mass, or inertia effect of a target UAV being tethered in the capture net and recommend action to prevent the tethered target UAV from causing UAV 200 to crash or lose maneuverability. For example, control system 210 may recommend UAV 200 to land, release the tether, or increase thrust.
  • Control system 210 may provide a control signal to the UAV control system (e.g., processor 211) to allow the UAV to autonomously or semi-autonomously take corrective actions, such as initiating an autonomous or semi-autonomous landing, increasing thrust to maintain altitude, or releasing the tether to jettison the target UAV in order to prevent the defending UAV from crashing.
  • Unmanned Aerial Vehicle 200 may include a camera system 212.
  • Camera system 212 may be used to visually detect a UAV.
  • Camera system 212 may visually detect an object and provide visual data (e.g., pixel data) to one of the one or more machine learning models 205.
  • a machine learning model may be trained to label an object as "a UAV" or "not a UAV” based on the visual data. For example, a set of positive examples (e.g., images of UAVs) and a set of negative examples (e.g., images of other objects) may be used to train the machine learning model.
  • Processor 211 may use the output from the machine learning model trained to label an object as a UAV based on the radar data and/or visual data to determine whether to activate the interdiction system 207. Processor 211 may activate interdiction system 207 in the event the machine learning model trained to label an object as a UAV based on radar data and the machine learning model trained to label the object as a UAV based on visual data indicate that the object is a UAV.
  • UAV 200 may use radar system 202 to detect an object that is greater than a threshold distance away.
  • UAV 200 may use camera system 212 to detect an object that is less than or equal to the threshold distance away.
  • UAV 200 may use both radar system 202 and camera system 212 to confirm that a detected object is actually a UAV. This reduces the number of false positives and ensures that the capture active mechanism is activated for actual UAVs.
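  • The radar/camera division of labor described in the preceding bullets might look like the decision sketch below, where radar_says_uav and camera_says_uav stand in for the outputs of the two machine learning models; the 50 m handover distance is an assumed value.

        HANDOVER_DISTANCE_M = 50.0   # assumed radar-to-camera handover range

        def confirm_uav(distance_m, radar_says_uav, camera_says_uav):
            """Rely on the radar-based model beyond the handover distance, and require
            the radar-based and camera-based models to agree at close range before
            treating the detection as a confirmed UAV."""
            if distance_m > HANDOVER_DISTANCE_M:
                return bool(radar_says_uav)
            return bool(radar_says_uav and camera_says_uav)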
  • Figure 3A is a diagram illustrating a front view of a UAV in accordance with some embodiments.
  • Unmanned aerial vehicle 301 is an example of UAV 200 of Figure 2 and/or aerial vehicle 102 of Figure 1.
  • front view 300 includes unmanned aerial vehicle 301 comprising computing chassis 302, first rotor 303a, second rotor 303b, first motor 304a, second motor 304b, first antenna 305a, second antenna 305b, first landing strut 306a, second landing strut 306b, first net launcher 307a, second net launcher 307b, first guide collar 309a, second guide collar 309b, interdiction sensor module 308, first structural isolation plate 310, visual detection system 311, disruption signal antenna 312, antenna clip 313, one or more cooling fans 314, first rotor arm bracket 315a, second rotor arm bracket 315b, first rotor arm 316a, second rotor arm 316b, second structural isolation plate 320, vibration isolation plate 330, vibration isolation plate 340, vibration isolation plate 350, and dampers 351.
  • Computing chassis 302 is configured to protect the CPU of UAV 301.
  • the CPU is configured to control the overall operation of UAV 301.
  • the CPU may be coupled to a plurality of computing modules.
  • the plurality of computing modules may include an interdiction control module, an image processing module, a safety module, a flight recorder module, etc.
  • the CPU may provide one or more control signals to each of the plurality of computing modules.
  • the CPU may provide a control signal to the interdiction control module to activate one of the net launchers 307a, 307b to deploy a net.
  • the CPU may provide a control signal to the image processing module to process an image captured by the visual detection system 311.
  • the CPU may be configured to perform one or more flight decisions for the UAV.
  • the CPU may provide one or more flight commands to a flight controller module.
  • a flight command may include a specified speed for the UAV, a specified flight height for the UAV, a particular flight path for the UAV, etc.
  • the flight controller module is configured to control the motors associated with the UAV (e.g., motors 304a, 304b) so that UAV 301 flies in a manner that is consistent with the flight commands.
  • the CPU is configured to receive flight instructions from a remote command center. In other embodiments, the CPU is configured to autonomously fly UAV 301.
  • the image processing module is configured to process images acquired by visual detection system 311.
  • the image processing module may be configured to determine whether a visually detected object is a UAV based on the visual data associated with the detected object.
  • the image processing module may include a plurality of machine learning models that are trained to label a detected object based on the visual data.
  • the image processing module may include a first machine learning model that is configured to label objects as a UAV, a second machine learning model that is configured to label objects as a bird, a third machine learning model that is configured to label objects as a plane, etc.
  • First structural isolation plate 310 is configured to isolate computing chassis 302 and its associated computing components from one or more noisy components. First structural isolation plate 310 is also configured to isolate the one or more noisy components from the electromagnetic interference noise associated with the computing components of computing chassis 302.
  • the one or more noisy components isolated from computing chassis 302 and its associated computing components by first structural isolation plate 310 may include a communications radio (not shown in the front view) and a communications disruption signal generator (not shown in the front view).
  • First structural isolation plate 310 may include a foil made from a particular metallic material (e.g., copper) and the foil may have a particular thickness (e.g., 0.1 mm). First structural isolation plate 310 and second structural isolation plate 320 may act as a structural frame for UAV 301. First structural isolation plate 310 may be coupled to second structural isolation plate 320 via a plurality of rotor arm brackets (e.g., rotor arm brackets 315a, 315b) and a plurality of side wall components (not shown in the front view). The rotor arm brackets are coupled to a corresponding rotor arm. The first structural isolation plate 310 may be attached to one or more rotor arm clips (not shown in the front view).
  • the one or more rotor arm clips are configured to lock and unlock corresponding rotor arms of UAV 301.
  • the one or more rotor arm clips are configured to lock the rotor arms in a flight position when UAV 301 is flying.
  • the one or more rotor arm clips are configured to unlock the rotor arms from a flight position when UAV 301 is not flying.
  • the rotor arms may be unlocked from the rotor arm clips when UAV 301 is being stored or transported to different locations.
  • First structural isolation plate 310 is coupled to vibration isolation plate 330 via a plurality of vibration dampers.
  • First structural isolation plate 310 may be coupled to one or more dampers configured to reduce the amount of vibration to which a plurality of vibration sensitive components are subjected.
  • the plurality of vibration sensitive components may include the computing modules included in computing chassis 302, connectors, and heat sinks. The performance of the vibration sensitive components may degrade when subjected to vibrations.
  • the one or more dampers may be omnidirectional dampers. The one or more dampers may be tuned to the specific frequency associated with a vibration source.
  • the vibrations may be mechanical vibrations caused by the motors of the UAV (e.g., motors 304a, 304b) and the rotors of the UAV (e.g., rotors 303a, 303b).
  • First structural isolation plate 310 in combination with vibration isolation plate 330 and the plurality of dampers are configured to shield the plurality of computing components from vibrations, noise, and EMI.
  • Vibration isolation plate 330 is coupled to antenna 312 associated with a communications disruption signal generator.
  • Antenna 312 may be a highly directional antenna (e.g., log periodic, parabolic, helical, yagi, phased array, horn, etc.) that is configured to transmit a communications disruption signal.
  • the communications disruption signal may have a frequency associated with one or more wireless communications devices that the communications disruption signal is attempting to disrupt.
  • the communications disruption signal may have a frequency between 2.1 GHz and 5.8 GHz.
  • UAV 301 includes second structural isolation plate 320.
  • a UAV may also be designed to include an isolation plate to isolate the noisy components from the radiating components and vice versa.
  • Second structural isolation plate 320 is configured to isolate the one or more noisy components from one or more antennas and one or more sensors and vice versa.
  • Second structural isolation plate 320 is also configured to act as a ground plane for the one or more antennas associated with a radio communications system of UAV 301.
  • Structural isolation plate 320 may also be coupled to one or more dampers to reduce an amount of vibration to which the noisy components are subjected.
  • the combination of structural isolation plate 310 and structural isolation plate 320 acts as a Faraday cage for the noisy components.
  • the combination of structural isolation plate 310 and structural isolation plate 320 is configured to isolate one or more high noise generating components of the UAV from the other components of the UAV.
  • a radio communications system and a communication disruption signal generator may be isolated from a plurality of computing components and a plurality of antennas. As a result, the influence that vibrations, noise, and EMI have on the overall performance of the UAV is reduced.
  • One or more cooling fans 314 are coupled to and may be positioned in between vibration isolation plate 330 and vibration isolation plate 340.
  • the high noise generating components of the UAV may generate a lot of heat during operation.
  • One or more cooling fans 314 are configured to direct air towards the high noise generating components such that a temperature of the high noise generating components of the UAV is reduced during operation.
  • a portion of the one or more cooling fans 314 may be placed adjacent to one of the openings of the structural frame comprising first structural isolation plate 310 and second structural isolation plate 320.
  • First rotor arm bracket 315a is coupled to first rotor arm 316a and second rotor arm bracket 315b is coupled to second rotor arm 316b.
  • First rotor arm 316a is coupled to motor 304a and rotor 303a.
  • Second rotor arm 316b is coupled to motor 304b and rotor 303b.
  • Rotor arm brackets 315a, 315b are configured to engage rotor arms 316a, 316b, respectively.
  • UAV 301 may lift off from a launch location and fly when rotor arms 316a, 316b are engaged with their corresponding rotor arm brackets 315a, 315b.
  • motors 304a, 304b may cause rotors 303a, 303b to rotate.
  • a radio communications system of UAV 301 may be associated with a plurality of antennas (e.g., antenna 305a, antenna 305b). Each antenna may operate at a different frequency. This enables the radio communications system to switch between frequency channels to communicate.
  • the radio communications system may communicate with a remote server via antenna 305a.
  • the radio communications system may transmit the data associated with the one or more sensors associated with UAV 301 (e.g., radar data, lidar data, sonar data, image data, etc.).
  • the frequency channel associated with antenna 305a may become noisy.
  • the radio communications system may switch to a frequency channel associated with antenna 305b.
  • the antennas associated with the radio communications system may be daisy chained together.
  • the persistent systems radio may communicate with one or more other UAVs and transmit via antennas 305a, 305b a signal back to a source through the one or more other UAVs.
  • another UAV may act as an intermediary between UAV 301 and a remote server.
  • UAV 301 may be out of range from the remote server to communicate using antennas 305a, 305b, but another UAV may be in range to communicate with UAV 301 and in range to communicate with the remote server.
  • UAV 301 may transmit the data associated with one or more sensors to the other UAV, which may forward the data associated with one or more sensors to the remote server.
  • the radio communications system of UAV 301 may be associated with three antennas (e.g., antenna 305a, antenna 305b, antenna 305c).
  • the antennas may be approximately 90 degrees apart from each other (e.g., 90° ± 5°).
  • the antennas may be coupled to the landing struts of UAV 301 (e.g., landing strut 306a, landing strut 306b, landing strut 306c) via an antenna clip, such as antenna clip 313.
  • This allows the antennas to have a tripod configuration, which allows the antennas to have enough fidelity to transmit the needed bandwidth of data.
  • the tripod configuration allows the antennas to have sufficient bandwidth to transmit video data or any other data obtained from the one or more sensors of UAV 301.
  • UAV 301 may include a fourth antenna (not shown) that is also coupled to one of the landing struts of UAV 301.
  • UAV 301 may be remotely controlled and the fourth antenna may be used for remote control communications.
  • the antennas coupled to the landing struts of UAV 301 may be integrated into the landing strut, such that an antenna is embedded within a landing strut.
  • UAV 301 may include guide collars 309a, 309b.
  • Guide collars 309a, 309b may be coupled to a plurality of launch rails.
  • UAV 301 may be stored in a hangar that includes the plurality of launch rails.
  • Guide collars 309a, 309b are hollow and may be configured to slide along the launch rails to constrain lateral movement of UAV 301 until it has exited the housing or hangar.
  • UAV 301 may include a vibration isolation plate 350 that is coupled to a battery cage via a plurality of dampers 351.
  • vibration isolation plate 350 may be coupled to net launchers 307a, 307b and interdiction sensor system 308.
  • Interdiction sensor system 308 may include at least one of a global positioning system, a radio detection and ranging (RADAR) system, a light detection and ranging (LIDAR) system, a sound navigation and ranging (SONAR) system, an image detection system (e.g., photo capture, video capture, UV capture, IR capture, etc.), sound detectors, one or more rangefinders, etc.
  • For example, eight LIDAR or RADAR beams may be used in the rangefinder to detect proximity to the target UAV.
  • Interdiction sensor system 308 may include one or more LEDs that indicate to bystanders whether UAV 301 is armed and/or has detected a target.
  • the one or more LEDs may be facing away from the back of UAV 301 and below UAV 301. This enables one or more bystanders under UAV 301 to become aware of a status associated with UAV 301.
  • Interdiction sensor system 308 may include image capture sensors which may be controlled by the interdiction control module to capture images of the object when detected by the range finding sensors. Based on the captured image and the range readings from the ranging sensors, the interdiction control module may identify whether or not the object is a UAV and whether the UAV is a UAV detected by one of the sensor systems.
  • If the interdiction control module determines that the object is a target UAV, it may also determine whether the target UAV is in an optimal capture position relative to the defending UAV. The position between UAV 301 and the target UAV may be determined based on one or more measurements performed by interdiction sensor system 308. If the relative position between the target UAV and the defending UAV is not optimal, the interdiction control module may provide a recommendation or indication to the remote controller of the UAV. The interdiction control module may provide or suggest course corrections directly to the flight controller module to maneuver UAV 301 into an ideal interception position autonomously or semi-autonomously.
  • the interdiction control module may automatically trigger one of the net launchers 307a, 307b. Once triggered, one of the net launchers 307a, 307b may fire a net designed to ensnare the target UAV and disable its further flight.
  • the net fired by the capture net launcher may include a tether connected to UAV 301.
  • the tether may be connected to the defending UAV by a retractable servo controlled by the interdiction control module such that the tether may be released based on a control signal from the interdiction control module.
  • the CPU of the UAV may be configured to sense the weight, mass, or inertia effect of a target UAV being tethered in the capture net and recommend action to prevent the tethered target UAV from causing UAV 301 to crash or lose maneuverability. For example, the CPU may recommend UAV 301 to land, release the tether, or increase thrust.
  • the CPU may provide a control signal to allow the UAV to autonomously or semi-autonomously take corrective actions, such as initiating an autonomous or semi-autonomous landing, increasing thrust to maintain altitude, or releasing the tether to jettison the target UAV in order to prevent the defending UAV from crashing.
  • UAV 301 may include visual detection system 311.
  • Visual detection system 311 may include one or more cameras. Visual detection system 311 may be used by a remote operator to control a flight path associated with UAV 301.
  • Visual detection system 311 may provide visual data to an image processing module configured to visually detect an object and provide visual data (e.g., pixel data) to one or more machine learning models. The one or more machine learning models may be trained to label an object as a UAV based on the visual data.
  • the image processing module may provide an output indicating that an object is labeled as a UAV to the interdiction control module.
  • the interdiction control module may be configured to activate net launchers 307a, 307b based on the label.
  • the interdiction control module may output a control signal that causes one of the net launchers 307a, 307b to deploy a net.
  • Figure 3B is a diagram illustrating a side view of a UAV in accordance with some embodiments.
  • unmanned aerial vehicle 301 may be used to implement a UAV, such as UAV 100.
  • side view 360 includes unmanned aerial vehicle 301 comprising computing chassis 302, UI panel 350, flight controller module 352, second rotor 303b, third rotor 303c, second motor 304b, third motor 304c, second antenna 305b, third antenna 305c, second landing strut 306b, third landing strut 306c, battery 317, battery cage 318, second net launcher 307b, interdiction sensor module 308, second guide collar 309b, first structural isolation plate 310, visual detection system 311, disruption signal antenna 312, antenna clip 313, second structural isolation plate 320, gimbal 335, tether mechanism 325, vibration dampers 332a, 332b, vibration isolation plate 340, and vibration isolation plate 350.
  • UI panel 350 is coupled to a safety module that is included in computing chassis 302.
  • UI panel 350 comprises one or more switches, knobs, and buttons that enable an operator to arm and disarm UAV 301.
  • An operator may interact with UI panel 350 and, based on the operator interactions, the safety module is configured to arm or disarm UAV 301.
  • first net launcher 307a and second net launcher 307b may be disarmed based on one or more interactions of an operator with UI panel 350. This may allow the operator to inspect and/or perform maintenance on UAV 301.
  • Flight controller module 352 is configured to control a flight of UAV 301.
  • the flight controller module may provide one or more control signals to the one or more motors (e.g., 304a, 304b) associated with UAV 301.
  • the one or more control signals may cause a motor to increase or decrease its associated revolutions per minute (RPM).
  • UAV 301 may be remotely controlled from a remote location.
  • UAV 301 may include an antenna that receives flight control signals from the remote location.
  • the CPU of UAV 301 may determine how UAV 301 should fly and provide control signals to flight controller module 352.
  • flight controller module 352 is configured to provide control signals to the one or more motors associated with UAV 301, causing UAV 301 to maneuver as desired by an operator at the remote location.
  • Antenna 305c is coupled to landing strut 306c.
  • Antenna 305c is one of the antennas associated with a communications radio system of UAV 301.
  • Antenna 305c is configured to operate at a frequency that is different than antennas 305a, 305b.
  • a communications radio system may be configured to switch between frequency channels to communicate.
  • the communications radio system may communicate with a remote server via antenna 305a.
  • the frequency channel associated with antenna 305a may become noisy.
  • the radio communications system may transmit the data associated with the one or more sensors associated with UAV 301 (e.g., radar data, LIDAR data, sonar data, image data, etc.).
  • the radio communications system may switch to a frequency channel associated with antenna 305b.
  • the frequency channel associated with antenna 305b may become noisy.
  • the radio communications system may switch to a frequency channel associated with antenna 305c.
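  • The channel-failover behavior described for antennas 305a, 305b, and 305c can be pictured as a rotation through the available frequency channels whenever the active channel becomes too noisy; the Python sketch below is hypothetical, with an assumed noise-floor threshold.

        class ChannelSelector:
            """Cycle through the channels associated with antennas 305a, 305b, 305c
            when the active channel becomes too noisy."""
            def __init__(self, channels, noise_limit_dbm=-80.0):
                self.channels = channels
                self.noise_limit = noise_limit_dbm
                self.active = 0

            def maybe_switch(self, measured_noise_dbm):
                """Fail over to the next channel if measured noise exceeds the limit."""
                if measured_noise_dbm > self.noise_limit:
                    self.active = (self.active + 1) % len(self.channels)
                return self.channels[self.active]

        selector = ChannelSelector(["channel_305a", "channel_305b", "channel_305c"])
        selector.maybe_switch(-70.0)   # too noisy, so switch to the next channel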
  • Battery 317 is configured to provide power to UAV 301.
  • UAV 301 is comprised of a plurality of components that require electricity to operate.
  • Battery 317 is configured to provide power to the plurality of components.
  • battery 317 is a rechargeable battery.
  • Battery 317 is housed within battery cage 318.
  • Battery cage 318 may be coupled to vibration isolation plate 350 via a plurality of dampers. Vibration isolation plate 350 may be coupled to interdiction sensor module 308, net launchers 307a, 307b, tether mechanism 325, and a persistent availability plug.
  • Gimbal 335 is coupled to visual detection system 311 and second structural isolation plate 320.
  • a gimbal is a pivoted support that allows the rotation of visual detection system 311 about a single axis.
  • Gimbal 335 is configured to stabilize an image captured by visual detection system 311.
  • Tether mechanism 325 is coupled to net capture launchers 307a, 307b. When a net is deployed by one of the net capture launchers 307a, 307b, the net remains tethered to UAV 301 via tether mechanism 325.
  • Tether mechanism 325 may be configured to sense the weight, mass, or inertia effect of a target UAV being tethered in the capture net.
  • a CPU of UAV 301 may be configured to recommend action to prevent the tethered target UAV from causing UAV 301 to crash or lose maneuverability. For example, the CPU of UAV 301 may recommend UAV 301 to land, release the tether, or increase thrust.
  • the CPU of UAV 301 may provide a control signal to allow the UAV to autonomously or semi-autonomously take corrective actions, such as initiating an autonomous or semi-autonomous landing, increasing thrust to maintain altitude, or releasing the tether to jettison the target UAV in order to prevent the defending UAV from crashing.
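  • A minimal sketch of this corrective-action logic is shown below. The sensor fields, units, and thresholds are assumptions, since the disclosure only names landing, releasing the tether, or increasing thrust as possible actions.

```python
# Minimal sketch of corrective-action selection for a tethered capture.
# Field names, units, and thresholds are assumed for illustration.

def tether_corrective_action(tether_load_n, max_payload_n, thrust_margin):
    """Return a recommended action given the sensed tether load.

    tether_load_n  -- load sensed by the tether mechanism, in newtons (assumed unit)
    max_payload_n  -- maximum load the vehicle can safely carry
    thrust_margin  -- fraction of thrust headroom remaining (0..1)
    """
    if tether_load_n > max_payload_n:
        return "release_tether"      # jettison the target to avoid crashing
    if thrust_margin < 0.15:         # assumed margin below which altitude cannot be held
        return "land"
    if thrust_margin < 0.35:
        return "increase_thrust"
    return "continue"
```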
  • Vibration dampers 332a, 332b are coupled to structural isolation plate 310 and vibration isolation plate 330. Vibration dampers 332a, 332b may be omnidirectional dampers. Vibration dampers 332a, 332b may be configured to reduce the amount of vibration to which a plurality of vibration sensitive components are subjected.
  • the plurality of vibration sensitive components may include different electronics modules (e.g., components included in computing chassis 302, connectors, and heat sinks). The performance of the vibration sensitive components may degrade when subjected to vibrations. Vibration dampers 332a, 332b may be tuned to the specific frequency associated with a vibration source.
  • the vibrations may be mechanical vibrations caused by the motors of the UAV (e.g., motors 304a, 304b) and the rotors of the UAV (e.g., rotors 303a, 303b).
  • Vibration dampers 332a, 332b may be tuned to the mechanical vibrations caused by the motors of the UAV and the rotors of the UAV.
  • Vibration dampers 332a, 332b may be comprised of a vibration damping material, such as carbon fiber.
  • one or more vibration dampers may be included in between a motor and a motor mount.
  • Figure 4 is a flowchart illustrating an embodiment of a process for automatically controlling flight of a vehicle.
  • the process of Figure 4 is performed by interceptor aerial vehicle 102 of Figure 1, UAV 200 of Figure 2, and/or UAV 301 of Figures 3A and 3B.
  • a location of a target is received.
  • An example of the location is a geographical location coordinate, and an example of the target is a vehicle detected to be not allowed in a monitored airspace (e.g., target aerial vehicle 110 of Figure 1).
  • the location is a geographical location coordinate that has been determined using one or more remote sensors.
  • the location of target aerial vehicle 110 is provided by ground station 104 via a wireless communication to interceptor aerial vehicle 102, and ground station 104 has determined the location based on information from sensors of ground station 104 and/or ground-based sensor 106 of Figure 1.
  • the location is received directly from one or more different sensors (e.g., from one or more ground-based sensors).
  • the location is relative to a reference aerial vehicle.
  • For example, the location (e.g., including azimuth angle, altitude angle, and distance) is determined using an on-vehicle sensor (e.g., a radar sensor of UAV 301).
  • the received location has been filtered (received as filtered) and/or is filtered (e.g., filtered using a processor on an aerial vehicle) based on one or more sensor measurements. For example, once a likely geographical location has been determined for the target, the location is filtered using a Kalman filter (e.g., linear Kalman filter) to reduce noise and inaccuracies.
  • the filter may take into account a model of a target vehicle and its movement/flight properties and capabilities to determine the location based on a series of sensor measures over time.
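  • The location filtering described above can be sketched with a linear, constant-velocity Kalman filter. The state layout, noise magnitudes, and time step below are assumptions; the disclosure only indicates that a (linear) Kalman filter and a target motion model may be used.

```python
import numpy as np

# Minimal sketch of a linear (constant-velocity) Kalman filter used to smooth
# noisy target-location measurements. Noise magnitudes and dt are assumed.

class TargetLocationFilter:
    def __init__(self, dt=0.1, meas_noise=5.0, accel_noise=2.0):
        self.x = np.zeros(6)                       # state: [px, py, pz, vx, vy, vz]
        self.P = np.eye(6) * 1e3                   # large initial uncertainty
        self.F = np.eye(6)
        self.F[:3, 3:] = np.eye(3) * dt            # position += velocity * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # only position is measured
        self.R = np.eye(3) * meas_noise ** 2       # measurement noise
        self.Q = np.eye(6) * accel_noise ** 2      # simplified process noise

    def update(self, measured_position):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct with the new measurement
        z = np.asarray(measured_position, dtype=float)
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                          # filtered position estimate
```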
  • the location of the target may be updated and continually received as a new or updated location of the target is determined. For example, the received location is a part of a stream of locations tracked for the target.
  • the received location is utilized to navigate towards the received location.
  • the location is received at a navigation component and/or a flight controller of an interceptor aerial vehicle and the direction and/or speed of the aerial vehicle is automatically and autonomously adjusted to turn and fly towards the received location in a navigation mode based on the received location.
  • This allows the interceptor aerial vehicle to approach and come closer to the vicinity of the target.
  • Because the received location may represent a rough and/or approximate location of the target, it may not represent the exact location of a moving target that needs to be known to successfully deploy an interdiction/capture mechanism.
  • a relative location of the target is detected using a captured image.
  • For example, the image is captured by an image sensor/camera of a reference system (e.g., interceptor aerial vehicle 102 of Figure 1).
  • the reference system is able to more accurately detect and automatically navigate to the location of the target based at least in part on the image sensor/camera data.
  • captured images are analyzed using a processor of the reference system in an attempt to detect the target aerial vehicle in the image using computer machine vision.
  • the reference system continues to proceed towards the received location in the received location-based navigation mode. However, once the target is detected in a captured image, the navigation of the reference system may now be performed based on the image rather than or in addition to the received location. If the image of the target is lost, the reference system may fall back to the previous received location-based navigation until the target is detected again in a captured image.
  • the detection of the target in the captured image may be performed using machine/computer vision. For example, using a machine learning model trained using training data of various different example targets, captured images of one or more image sensors are analyzed to identify a likely target in the image. In some embodiments, the relative location is specified by an azimuth angle and an altitude angle.
  • a distance value to the target and/or desired speed/acceleration may also be specified for the relative location.
  • the relative location is specified by a three-dimensional vector determined based on the detected image of the target.
  • the three-dimensional vector is specified by values corresponding to three axial values.
  • a flight operation is performed based on the detected relative location of the target. For example, a flight direction and/or speed of an interceptor aerial vehicle are automatically and autonomously adjusted based on a location and size of the target within the captured image (e.g., adjust direction to center the target within subsequently captured images). This allows the flight path of the interceptor aerial vehicle to be dynamically adjusted to track and follow a moving target, allowing the interceptor aerial vehicle to come within a threshold distance where an interdiction/capture system (e.g., a net fired from the interceptor aerial vehicle) can be effectively deployed.
  • a scaled three-dimensional vector based on the relative location of the target and determined based on the detected image of the target is used to adjust a flight path of the interceptor aerial vehicle in the direction and intensity of the three-dimensional vector.
  • the three-dimensional vector specifies directional and speed/acceleration changes to be performed by the interceptor aerial vehicle to direct the flight path of the interceptor aerial vehicle at a desired rate of change towards the target aerial vehicle.
  • Figure 5 is a flowchart illustrating an embodiment of a process for determining a vector associated with a target.
  • the process of Figure 5 is performed by interceptor aerial vehicle 102 of Figure 1, UAV 200 of Figure 2, and/or UAV 301 of Figures 3A and 3B.
  • at least a portion of the process of Figure 5 is performed in 406 and/or 408 of Figure 4.
  • the process of Figure 5 may be repeated when a target aerial vehicle is detected in an image being continually captured by an interceptor aerial vehicle.
  • a relative distance to a target is determined based on a captured image of the target.
  • the image is acquired from a camera onboard an interceptor aerial vehicle (e.g., camera 212 of Figure 2).
  • an image sensor/camera of the interceptor aerial vehicle continually captures a stream of images in one or more directions including in the direction of travel of the interceptor aerial vehicle and each captured image is analyzed to identify a target aerial vehicle in the image, if possible.
  • the detection of the target in the captured image may be performed using machine/computer vision. For example, using a machine learning model trained using training data of various different images of targets, the captured image is analyzed to identify the target aerial vehicle in the image.
  • the output of the image analysis may include a bounding box outlining an area within the image that includes the image portion with the target (e.g., bounding box outlines the minimum sized box that includes detected features of the target) and/or a classification of the detected target (e.g., type, model, class, manufacturer, size, etc. of the detected target).
  • the relative distance to the target identifies the relative distance between a reference system (e.g., interceptor aerial vehicle) and the target (e.g., target aerial vehicle).
  • the relative distance may be determined based on a size of the bounding box within the entire captured image and a detected classification of the target. For example, the target that is further away will appear smaller in the image as compared to the target that is closer to the reference location, and thus the size of the detected target within the image can be used as a proxy for the distance to the target.
  • different types of target aerial vehicles may be different sizes and the actual physical size of the target needs to be taken into account when using the target size within the image as a proxy for relative distance.
  • the machine learning model has been trained using different examples of types of aerial vehicles (e.g., different models, types, manufacturers, sizes, etc.) to be able to classify and identify the specific type of the detected target aerial vehicle and a corresponding size profile corresponding to the specific type. If a specific type of the target cannot be reliably determined, a default size profile corresponding to a general type may be utilized.
  • a measurement of the target within the image can be mapped to a distance value based on a determined size corresponding to the detected type/classification of the target.
  • a table and/or formula for the mappings may be predetermined for different types/classifications.
  • the mapping to the distance value may also take into account one or more properties of the camera utilized to capture the image.
  • the sensor size, sensor pixel configuration, a zoom setting, and/or a field of view of the camera is utilized in adjusting or normalizing the distance value and/or the measurement of the target image portion within the overall image.
  • images from different cameras are utilized and analyzed together to determine the distance to the target.
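  • A minimal sketch of the size-to-distance mapping follows, assuming a pinhole-camera relationship and illustrative per-class size profiles; the specific numbers and class names are not from the disclosure.

```python
# Minimal sketch of using the detected bounding-box size as a proxy for range.
# Size profiles, focal length, and class names below are assumed for illustration.

SIZE_PROFILES_M = {           # assumed physical widths (meters) per classification
    "small_quadcopter": 0.25,
    "large_quadcopter": 0.60,
    "default": 0.40,          # fallback when the specific type cannot be determined
}

def estimate_distance(bbox_width_px, classification, focal_length_px):
    """Pinhole approximation: distance = real_width * focal_length / pixel_width."""
    real_width = SIZE_PROFILES_M.get(classification, SIZE_PROFILES_M["default"])
    if bbox_width_px <= 0:
        raise ValueError("bounding box width must be positive")
    return real_width * focal_length_px / bbox_width_px

# Example: a 40-pixel-wide detection of a small quadcopter with an assumed
# 1000-pixel focal length maps to roughly 6.25 meters.
print(estimate_distance(40, "small_quadcopter", 1000.0))
```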
  • a relative direction to the target is determined based on the image. Because the camera that captured the image is affixed to the reference aerial vehicle at a known location and orientation, the direction captured by each pixel of the camera sensor is known and can be predetermined. For example, based on the known height and width of the image sensor (e.g., number of pixels in height and width), optical properties of the camera lens (e.g., the field of view of the camera), and the capture direction/orientation of the camera with respect to the reference aerial vehicle, each location within the image (e.g., each pixel) can be mapped to a specific relative direction with respect to the reference aerial vehicle.
  • the output of the image analysis may include a bounding box identifying an area within the image that includes the target and the center of the bounding box can be used as a representative central point for the target.
  • another type of point on the target within the image may be selected as the representative point for the target (e.g., specifically identified point on a body of the target).
  • the location (e.g., x and y location) of this representative point within the image is determined (e.g., pixel location). This two-dimensional location represents the relative direction of the target with respect to the reference as this two-dimensional location can be mapped to a corresponding azimuth angle and altitude angle.
  • a table and/or formula for the mapping between the pixel/image location and corresponding azimuth angle and altitude angle may be predetermined.
  • images from different cameras are utilized and analyzed together to determine the relative direction of the target.
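  • A minimal sketch of the pixel-to-angle mapping follows, assuming a simple linear (equiangular) relationship between pixel offset and angle; the disclosure only states that the mapping can be predetermined from the sensor dimensions, lens field of view, and camera mounting orientation.

```python
# Minimal sketch of mapping a pixel location to relative azimuth/altitude angles.
# A linear (equiangular) mapping and the example field-of-view values are assumed.

def pixel_to_angles(px, py, width_px, height_px, hfov_deg, vfov_deg):
    """Return (azimuth_deg, altitude_deg) of an image point relative to the camera axis."""
    azimuth = (px - width_px / 2.0) / (width_px / 2.0) * (hfov_deg / 2.0)
    altitude = (height_px / 2.0 - py) / (height_px / 2.0) * (vfov_deg / 2.0)  # image +y is down
    return azimuth, altitude

# Example: the center of a bounding box at pixel (800, 300) in a 1280x720 frame
# with an assumed 70 x 40 degree field of view.
print(pixel_to_angles(800, 300, 1280, 720, 70.0, 40.0))
```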
  • a vector identifying the relative location of the target is determined using at least the determined relative direction of the target. For example, determined angular directions (e.g., azimuth angle and altitude angle) can be mapped to a three-dimensional vector identified by three axial components (e.g., x-axis, y-axis, and z-axis values). In some embodiments, a predetermined relationship between the relative direction angles and the vector components are utilized to determine the vector. In some embodiments, the vector is at least in part defined using a distance and/or speed value.
  • the distance value determined in 502 is mapped to a desired speed of travel to reach the target (e.g., greater speed desired for further distance and slower speed desired for shorter distance to allow more time for direction correction) and the length of the vector is based on the determined distance and/or speed.
  • the vector is defined by the azimuth angle, the altitude angle, and a factor based on an associated distance or speed.
  • the vector is defined by three-dimensional coordinate-based values and the vector is to be provided along with the associated distance and/or speed.
  • scaling the vector includes modifying a speed and/or acceleration associated with the vector as well as smoothing and limiting the rate of change associated with the vector. In some embodiments, the scaling the vector includes modifying one or more directional component values of the vector.
  • Modifying the speed and/or acceleration associated with the vector includes adjusting a factor used to control the speed and/or acceleration of the interceptor aerial vehicle based on an amount of directional change specified by the vector. For example, when a large directional change is required by the interceptor aerial vehicle to fly towards the target, a slower speed is desirable to allow the directional change to take place at a slower speed over a shorter distance as compared to a faster speed that is desirable to reach the target quicker when no or minor directional change is required.
  • the speed and/or acceleration factor is multiplied by one or more scalar values that change in magnitude based on a directional deviation of the vector from a current direction of movement (e.g., deviation of the target in the image from the center of the image representing the current heading direction of the interceptor aerial vehicle). For example, plotting the scalar value with respect to a directional deviation results in a bell curve where the highest scalar value is associated with the center of the graph corresponding to no change in directional deviation (e.g., the directional deviation given by a component value of the vector) and the scalar value decreases exponentially on either side of the center as the directional deviation increases.
  • the scalar value may decrease linearly on either side of the center as the directional deviation increases.
  • the scalar value is a function of one or more of a plurality of directional axes of the vector.
  • the scalar value is a combination of one scalar value that varies based on a relationship (e.g., function) with one component value of the vector and another scalar value that varies based on a different relationship (e.g., a different function) with another component value of the vector.
  • the factor used to control the speed and/or acceleration of the interceptor aerial vehicle includes limiting its rate of change and/or value (e.g., either by a reduction scalar or a maximum change limit) so that the interceptor aerial vehicle does not change its speed in a manner that is too sudden or difficult to achieve given flight hardware limitations. For example, a magnitude of change of the speed and/or acceleration is allowed up to a predetermined maximum limit.
  • the scaling the vector includes modifying one or more directional component values of the vector.
  • a rate of change and/or a value of one or more components of the vector is limited (e.g., either by a reduction scalar value or a limit) so that the interceptor aerial vehicle does not change directions in a manner that is too sudden or difficult to achieve given flight hardware limitations.
  • a magnitude of change of a component value of the vector is limited by a predetermined maximum limit.
  • the one or more component values of the vector are multiplied by one or more scalar values that change in magnitude based on a deviation of the corresponding component value from a center reference value (e.g., corresponding to a current direction of movement).
  • plotting the scalar value with respect to a deviation results in a bell curve where the highest scalar value is associated with the center of the graph corresponding to no change in deviation from a reference and the scalar value decreases exponentially on either side of the center as the deviation increases.
  • the scalar value may decrease linearly on either side of the reference as the deviation increases.
  • the scalar value may be specified to a particular directional axis of the vector. For example, different scalar value functions correspond to different component values of the vector.
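  • The bell-curve scaling and rate limiting described above can be sketched as follows; a Gaussian is used as the bell curve, and the width and limits are assumptions, since the disclosure only requires a scalar that peaks at zero deviation and a cap on the rate of change.

```python
import math

# Minimal sketch of bell-curve scaling and change limiting for a commanded value.
# The Gaussian width and the change limit are assumed for illustration.

def bell_scalar(deviation_deg, width_deg=15.0):
    """Largest (1.0) at zero deviation, decaying as the deviation increases."""
    return math.exp(-(deviation_deg / width_deg) ** 2)

def limit_change(new_value, previous_value, max_delta):
    """Clamp the per-update change so commands stay within hardware limits."""
    delta = max(-max_delta, min(max_delta, new_value - previous_value))
    return previous_value + delta

# Example: scale a commanded speed down when the target sits far from image center,
# then limit how quickly the command may differ from the previous one.
speed_cmd = 8.0 * bell_scalar(deviation_deg=20.0)        # large deviation -> slower approach
speed_cmd = limit_change(speed_cmd, previous_value=6.0, max_delta=1.0)
print(speed_cmd)
```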
  • the scaled vector and one or more associated values are provided.
  • the scaled vector and associated directional values and distance, speed, and/or acceleration values are provided for use in navigating the interceptor aerial vehicle towards the target aerial vehicle.
  • the scaled vector and one or more associated values are provided for use in performing the flight operation in 408 of Figure 4.
  • FIG. 6 is an illustration illustrating an example of a two dimensional vector determined using an image from a camera feed.
  • Graph representation 600 shows target 602, an x- axis 604, y-axis 606, 2D vector 608, degX 610, and degY 612.
  • the target 602 may comprise pixels associated with a target.
  • the number of pixels associated with the target 602 may be utilized to determine the depth of the target 602 by comparing the number to the total number of pixels.
  • the x-axis 604 may measure the number of degrees the target 602 is located left or right of the center of the camera feed (e.g., the degX 610). In some embodiments, the x-axis 604 measures from -21 degrees to 21 degrees.
  • the y-axis 606 may measure the number of degrees the target 602 is located above or below the center of the camera feed (e.g., the degY 612). In some embodiments, the y-axis 606 measures from -35 degrees to 35 degrees.
  • the 2D vector 608 is the combination of the degX 610 and the degY 612.
  • FIG. 7 is an illustration illustrating an example of a three dimensional vector determined using two dimensional measurements.
  • 3D vector 700 comprises a target 702, a UD factor 704, an LR factor 706, and an FB factor 708.
  • Target 702, which is associated with a 2D vector, has a 3D vector determined, which may comprise the UD factor 704, the LR factor 706, and the FB factor 708.
  • the UD factor 704, the LR factor 706, and the FB factor 708 may be calculated using the example equations shown in Figure 8.
  • Figure 8 shows example equations that can be utilized to determine a three dimensional vector.
  • the equations are based on measurements for two axes of an image, such as an x-axis (degX) and a y-axis (degY).
  • An initial 2D vector may be determined based on the location of the target pixels.
  • the 2D vector is then converted into a 3D vector comprising three factors: an up/down factor (UD), a front/back factor (FB), and a left/right factor (LR).
  • the conversion may be performed utilizing the equations shown in Figure 8.
  • the vector comprising UD, FB, and LR, as well as an associated speed value, may be scaled and/or smoothed.
  • associated speed values are utilized to modify the initial vector components UD, FB, and LR.
  • the speed may be utilized as a scalar multiplier, or the speed may affect the components of the 3D vector differently.
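  • The sketch below illustrates one plausible conversion from the angular measurements (degX, degY) to UD, LR, and FB components. The equations of Figure 8 are not reproduced in this text, so the spherical-style projection used here is an assumption, not the patented formula.

```python
import math

# Minimal sketch of converting the 2D angular measurements (degX, degY) into
# up/down (UD), left/right (LR), and front/back (FB) components. The projection
# below is an assumed stand-in for the equations of Figure 8.

def angles_to_3d(deg_x, deg_y):
    rx, ry = math.radians(deg_x), math.radians(deg_y)
    ud = math.sin(ry)                    # positive when the target is above center (assumed sign)
    lr = math.sin(rx) * math.cos(ry)     # positive when the target is to the right (assumed sign)
    fb = math.cos(rx) * math.cos(ry)     # largest when the target is straight ahead
    return ud, lr, fb

print(angles_to_3d(10.0, -5.0))
```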
  • FIG. 9 is an illustration illustrating examples of functions utilized to adjust component values of a vector.
  • Logic smoothing functions 900 comprises an x-axis logic smoothing function 902 and a y-axis logic smoothing function 904.
  • the x-axis logic smoothing function 902 and the y-axis logic smoothing function 904 may depend on the x-axis and y-axis, respectively.
  • the x-axis logic smoothing function 902 and the y-axis logic smoothing function 904 may be applied to vectors, such as 3D vector 700.
  • the x-axis logic smoothing function 902 and y-axis logic smoothing function 904 may be applied to one or more of the components of the vectors.
  • the x-axis logic smoothing function 902 may be applied to the LR factor 706 and the FB factor 708, but not the UD factor 704.
  • the y-axis logic smoothing function 904 may be applied similarly in this example.
  • Figure 10 shows example equations that can be utilized to scale/alter component values of a vector.
  • a logistical smoother alters the vector to stabilize the motion of a drone.
  • the logistical smoother may utilize the axial measurements (e.g., degX and degY) to alter the vector.
  • the logistical smoother may also utilize predetermined constants to alter the vector.
  • UD is modified by a constant.
  • FB and LR are modified by a speed factor, which depends on both degX and degY; degX and degY may be converted to their absolute values prior to calculation, as shown in the equations of Figure 10.
  • the udfactor in Equation 4 may be a constant. Equations 7 and 8 may both be applied when determining the speed factor for Equations 5 and 6.
  • each vector component may utilize a specific component of speed factor (e.g., either Equation 7 or Equation 8).
  • the constants a, b, and c depicted in Equations 7 and 8 may be the same value or they may be different values. That is, each constant may take a different value, and a given constant may differ depending on whether it is applied to the speed factor for degX or for degY.
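  • The sketch below illustrates the general shape of such a logistic ("logistical") speed-factor smoother. The exact equations and constants of Figure 10 are not reproduced in this text, so the logistic form and the values of a, b, and c below are assumptions.

```python
import math

# Minimal sketch of a logistic speed-factor smoother. The logistic form and the
# constants a, b, c, and ud_constant are assumed for illustration.

def speed_factor(deg, a=0.3, b=10.0, c=1.0):
    """Close to c for small |deg| and decaying toward 0 as |deg| grows."""
    return c / (1.0 + math.exp(a * (abs(deg) - b)))

def smooth_vector(ud, fb, lr, deg_x, deg_y, ud_constant=0.5):
    ud_out = ud * ud_constant                                  # UD modified by a constant
    fb_out = fb * speed_factor(deg_x) * speed_factor(deg_y)    # FB/LR modified by the speed factor
    lr_out = lr * speed_factor(deg_x) * speed_factor(deg_y)
    return ud_out, fb_out, lr_out

print(smooth_vector(0.2, 0.9, 0.3, deg_x=12.0, deg_y=-4.0))
```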
  • the final 3D vector can be utilized to perform a flight operation.
  • FIG. 11 illustrates an embodiment of a spacio-temporal awareness engine 1100.
  • engine 1100 may be utilized in 406 of Figure 4.
  • the spacio-temporal awareness engine 1100 comprises camera 1 1102, camera 2 1104, camera n 1106, low resolution converter 1108, image 1 1114, image 2 1116, image n 1118, anomaly detect 1120, and camera of interest 1122.
  • the low resolution converter 1108 comprises the noise filter 1110 and feature consolidation 1112.
  • the spacio-temporal awareness engine 1100 may be operated in accordance with the tree based region selection process 1200 and the tree-based region selection process 1300.
  • the spatial-temporal (spacio-temporal) awareness engine utilizes the limited resources available on an autonomous robotic system (ARS) (e.g., drones, self-driving cars, etc.).
  • the spacio-temporal awareness engine utilizes a multi-camera view which is processed in parallel by a cascade of noise removal and super pixel feature consolidation and isolation algorithms to produce lower-resolution images. These images are processed continuously using proprietary anomaly detection and populated into probability-distribution-based priority quadtree and/or octree maps for further processing by the main high resolution tracking engine.
  • An anomaly or change detection algorithm uses a combination of traditional edge and contour-based features in combination with a temporal prediction filter. The advantage of this two-tier architecture is the ability to reject and give prioritized areas for heavier, more computationally intensive algorithms.
  • Deep neural networks may be executed to periodically detect objects and distinguish targets.
  • a main high resolution tracking engine executes high-speed feature-based tracking based on disparity of similarity features with live adaptation.
  • a tracker algorithm takes control and maintains lock on the target.
  • Applying different tracking algorithms and DNN based detections of a target in the video frame provides robustness at a high compute cost.
  • a reduced resource background tracker may incrementally predict the location of a target in the frame with low compute cost and average robustness. This optimization enables the coexistent application of advanced machine vision algorithms in addition to specialized lower cost algorithms.
  • An example implementation includes means for determining a response direction to locate and track items of interest to respond to changes in monitored video data.
  • An example implementation includes a method comprising optimizing resources for processing a video data stream from a mobile capture device using a set of processing schemes to track one or more items of interest.
  • a performance score is associated with each processing scheme's confidence to track the one or more items of interest.
  • the method includes repeatedly determining an active processing scheme based on the processing scheme with the highest performance score from the set of processing schemes. In response to the performance score of the active processing scheme failing to satisfy a threshold, the method selects another processing scheme to process the video data stream.
  • Processing the video data stream can include identifying one or more items, classifying each of the items, and tracking one or more of the items as an item of interest based on the classification. Processing the video data stream can include noise filtering and feature consolidation.
  • selecting another processing scheme is based on determining a number of items of interest in the video data stream.
  • the set of processing schemes can utilize different resource levels to process the video data stream.
  • the set of processing schemes process the video data stream using different resolutions.
  • the performance scores can be re-calculated based on a trigger, a resource threshold, or a time interval.
  • the threshold can be based on available computing resources associated with the mobile capture device.
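  • The scheme-selection behavior described above can be sketched as follows; the scheme names, scores, and threshold are illustrative assumptions.

```python
# Minimal sketch of selecting the active processing scheme by performance score
# and falling back when the active scheme's score drops below a threshold.
# Scheme names, scores, and the threshold are assumed for illustration.

def select_active_scheme(schemes, scores, active, threshold):
    """schemes: list of scheme ids; scores: dict id -> latest normalized score (0-1)."""
    if scores.get(active, 0.0) >= threshold:
        return active                                  # keep the current scheme
    # Otherwise switch to the scheme with the highest current performance score.
    return max(schemes, key=lambda s: scores.get(s, 0.0))

schemes = ["high_cost_detector", "medium_cost_tracker", "low_cost_extrapolator"]
scores = {"high_cost_detector": 0.9, "medium_cost_tracker": 0.4, "low_cost_extrapolator": 0.2}
active = select_active_scheme(schemes, scores, active="medium_cost_tracker", threshold=0.5)
print(active)  # falls back to the high-cost, high-robustness scheme
```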
  • a system can include a video capture module, a sensor module, a control module, and one or more processors to direct control based on a detected change in a region of interest monitored by the video capture module or the sensor module.
  • the one or more processors are configured to monitor multiple regions of interest in video data from the video capture module, and in response to detecting a change in a region of interest, determine a response direction for the control module based on the sensor module, wherein the response direction indicates an approximate location for an item of interest.
  • monitoring the region of interest can include tracking an item of interest, and the change in the region of interest can include no longer detecting the item of interest in the region of interest.
  • the sensor module can be used to detect the response direction in view of a last detected location for the item of interest.
  • the system can control the video capture module, a navigation system of the control module, or a feedback interface based on the response direction. For example, based on the response direction indicating an updated location for the item of interest, cameras can be moved or re-focused, flight code can be updated, or visual feedback can be provided directed towards a possible location using the response direction.
  • the updated location can be an approximate or predicted area based on the monitored video data and/or sensor data.
  • the sensor module can include sensors coupled to the control module or the video capture module and/or receive sensor readings from external sensor systems, such as ground-based sensors including radar, radio frequency, proximity, acoustic, thermal imaging, night vision, and global positioning system sensors.
  • a system includes a video capture module and one or more processors configured to process a video data stream using a set of processing schemes to track one or more items of interest, where a performance score is associated with the confidence of each processing scheme to track the one or more items of interest, wherein an active processing scheme is repeatedly determined based on the processing scheme with the highest performance score from the set of processing schemes.
  • the one or more processors select another processing scheme to process the video data stream.
  • the system can include a sensor module, where the one or more processors are further configured to: monitor multiple regions of interest in the video data stream; and in response to detecting a change in a region of interest, determine a response direction based on the sensor module, wherein the response direction indicates an approximate location for an item of interest.
  • the sensor module can receive readings from at least one of a radar, a radio frequency, proximity, acoustic, thermal imaging, night vision, and global positioning system sensors.
  • the one or more processors are further configured to at least one of control a navigation system, an interface, and the video capture module based on the response direction.
  • Figure 12 illustrates an embodiment of a tree based region selection process. At least a portion of the process of Figure 12 may be performed in 406 of Figure 4. Referring to Figure 12:
  • tree based region selection process 1200 receives a high resolution stream from a first imaging sensor.
  • tree based region selection process 1200 generates a low resolution stream from a second imaging sensor using a low-resolution conversion.
  • tree based region selection process 1200 applies noise filtering.
  • tree based region selection process 1200 performs feature consolidation.
  • tree based region selection process 1200 detects an anomaly within the low resolution stream.
  • tree based region selection process 1200 creates a prioritized region surrounding the detected anomaly.
  • tree based region selection process 1200 performs anomaly detection within a corresponding region of the high resolution stream.
  • tree based region selection process 1200 outputs the anomaly location.
  • tree based region selection process 1200 ends.
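  • The two-tier region selection can be sketched as follows; the downsampling factor and the simple frame-difference detector stand in for the noise filtering, feature consolidation, and anomaly detection steps named above and are assumptions of this sketch.

```python
import numpy as np

# Minimal sketch of two-tier region selection: detect an anomaly in a
# downsampled stream, then re-check only the corresponding region of the
# high-resolution stream. The detector and parameters are assumed.

def detect_anomaly(frame, previous, diff_threshold=25):
    """Return (row, col) of the largest frame difference, or None if below threshold."""
    diff = np.abs(frame.astype(int) - previous.astype(int))
    if diff.max() < diff_threshold:
        return None
    return np.unravel_index(np.argmax(diff), diff.shape)

def select_region(high_res, prev_high_res, factor=8, window=64):
    low = high_res[::factor, ::factor]                 # crude low-resolution conversion
    prev_low = prev_high_res[::factor, ::factor]
    hit = detect_anomaly(low, prev_low)
    if hit is None:
        return None
    r, c = hit[0] * factor, hit[1] * factor            # map back to high-resolution coordinates
    region = high_res[max(0, r - window): r + window, max(0, c - window): c + window]
    prev_region = prev_high_res[max(0, r - window): r + window, max(0, c - window): c + window]
    return detect_anomaly(region, prev_region)         # confirm within the prioritized region
```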
  • Figure 13 illustrates an embodiment of a tree-based region selection process. At least a portion of the process of Figure 13 may be performed in 406 of Figure 4. Referring to Figure 13:
  • tree-based region selection process 1300 receives a video input from a plurality of cameras. In subroutine block 1304, the tree-based region selection process 1300 applies a low-resolution conversion. In subroutine block 1306, the tree-based region selection process 1300 detects anomalies in the video input. In block 1308, the tree-based region selection process 1300 detects anomalies within the region of interest in the high-resolution image. In block 1310, the tree-based region selection process 1300 outputs the anomaly location.
  • Figure 14 illustrates an embodiment of parallel image processing process 1400. At least a portion of the process of Figure 14 may be performed in 406 of Figure 4.
  • the parallel image processing process 1400 comprises the high resolution process 1418 and the low resolution process 1416.
  • the high resolution process 1418 comprises image sensor 1404, anomaly detection 1406, and region of interest 1408.
  • the low resolution process 1416 comprises the image sensor 1402, the low resolution converter 1412, the anomaly detection 1414, and the region of interest 1410.
  • FIG. 15 illustrates a tracking system 1500 in accordance with one embodiment.
  • the tracking system 1500 comprises cameras 1528 producing multi-camera views 1518 that are input to a processor 1508.
  • the processor operates to filter and de-noise the multi-camera views 1518 to populate a pixel domain 1516.
  • the pixel domain 1516 is divided into nodes (e.g., node 1506, node 1520, node 1514, and node 1512) that are then analyzed by a high resolution tracker 1502.
  • Output of the high resolution tracker 1502 is input to a fast, low power consumption low resolution tracker 1510.
  • the node 1506 comprises an anomaly 1522.
  • the high resolution tracker 1502 identifies the anomaly 1522 as a detected object 1524, which is then tracked by the low resolution tracker 1510.
  • Figure 16 illustrates an embodiment of a quadtree 1600.
  • the quadtree 1600 comprises: node 1604, node 1608, node 1606, subnode 1610, subnode 1612, subnode 1614, n-subnode 1616, n-subnode 1618, n-subnode 1622, and n-subnode 1624.
  • node 1602 may be divided into subnode 1610, subnode 1612, subnode 1614, and subnode 1620.
  • Subnode 1620 may be divided into n-subnode 1616, n-subnode 1618, n-subnode 1622, and n-subnode 1624.
  • an input image is divided into node 1604, node 1608, node 1606, and node 1602. Based on a resolution and probability target, node 1602 is selected as the most likely to contain the drone.
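  • One level of the probability-driven quadtree descent can be sketched as follows; how node probabilities are computed is left abstract here, since the disclosure describes a probability-distribution-based priority quadtree without fixing that computation in this text.

```python
# Minimal sketch of quadtree subdivision that descends into the child node most
# likely to contain the target. The probability function is supplied by the
# caller and is an assumption of this sketch.

def subdivide(x, y, w, h):
    """Split a region into four quadrant nodes (x, y, width, height)."""
    hw, hh = w // 2, h // 2
    return [(x, y, hw, hh), (x + hw, y, w - hw, hh),
            (x, y + hh, hw, h - hh), (x + hw, y + hh, w - hw, h - hh)]

def most_likely_node(region, probability_fn, depth, min_size=64):
    """Recursively descend into the highest-probability quadrant."""
    x, y, w, h = region
    if depth == 0 or w <= min_size or h <= min_size:
        return region
    children = subdivide(x, y, w, h)
    best = max(children, key=probability_fn)
    return most_likely_node(best, probability_fn, depth - 1, min_size)

# Example with a toy probability function that prefers the lower-right of the image.
print(most_likely_node((0, 0, 1024, 1024), lambda r: r[0] + r[1], depth=3))
```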
  • FIG. 17 illustrates an embodiment of a system 1700 for converting camera input into a vector for low resolution tracker 1510.
  • One or more components of system 1700 may be utilized in 406 of Figure 4.
  • In the depicted example, input from one of the system cameras (e.g., sub-camera 1716) is processed to identify candidate macro features, which are then pruned.
  • macro feature 1710 was pruned as being characteristic of a non-target object (e.g., a bird in flight).
  • Primary macro features are identified from the pruned set as a region of interest 1714 and vectorized (converted from the pixel domain to a vector or parameterized description) into a region of interest 1712 using a feature extraction and classification 1706 process. This results in a final vector 1724 that may be applied to operate a mitigation system and direct the drone's main camera 1720.
  • Figure 18 illustrates an embodiment of a subsystem 1800 for prioritizing a region of interest in the focus of a camera.
  • One or more components of subsystem 1800 may be utilized in 406 of Figure 4.
  • a camera 1836 produces a camera output 1832 that is divided into pixel groups (pixel group 1804, pixel group 1816, pixel group 1818, and pixel group 1820 in this example).
  • the focus pixels 1834 include pixel group 1816, which is divided into subgroups (pixel group 1808, pixel group 1822, pixel group 1802, etc.).
  • the focus pixels 1824 include pixel group 1808, which is divided into nodes (node 1810, node 1806, node 1826, etc.).
  • the focus pixels 1828 includes node 1810 from which the region of interest 1814 is identified, and focused, to produce focused region of interest 1812. In this manner, the focus of the camera 1836 is progressively narrowed onto the eventual region of interest 1812.
  • Figure 19 illustrates several components of an exemplary region of interest tracking system 1900 in accordance with one embodiment.
  • One or more components of system 1900 may be utilized in 406 of Figure 4.
  • the region of interest tracking system 1900 may be included on a drone device (e.g., as a printed circuit board) to provide the capability to perform operations such as those described herein.
  • region of interest tracking system 1900 may include many more components than those shown in Figure 19. However, it is not necessary that all of these generally conventional components be shown in order to disclose an illustrative embodiment.
  • Collectively, the various tangible components or a subset of the tangible components may be referred to herein as "logic" configured or adapted in a particular way, for example as logic configured or adapted with particular software or firmware.
  • the region of interest tracking system 1900 may comprise one or more physical and/or logical devices that collectively provide the functionalities described herein. In some embodiments, the region of interest tracking system 1900 may comprise one or more replicated and/or distributed physical or logical devices.
  • the region of interest tracking system 1900 may comprise one or more computing resources provisioned from a "cloud computing" provider, for example, Amazon Elastic Compute Cloud (“Amazon EC2”), provided by Amazon.com, Inc. of Seattle, Washington; Sun Cloud Compute Utility, provided by Sun Microsystems, Inc. of Santa Clara, California; Windows Azure, provided by Microsoft Corporation of Redmond, Washington, and the like.
  • Region of interest tracking system 1900 includes a bus 1902 interconnecting several components including a network interface 1908, a display 1906, a central processing unit 1910, and a memory 1904.
  • Memory 1904 can comprise a random access memory ("RAM") and a permanent non-transitory mass storage device, such as a hard disk drive or solid-state drive.
  • Memory 1904 stores an operating system 1912. These and other software components may be loaded into a memory 1904 of the region of interest tracking system 1900 using a drive mechanism (not shown) associated with a non-transitory computer-readable medium 1916, such as a memory card, or the like.
  • Memory 1904 also includes database 1914.
  • region of interest tracking system 1900 may communicate with database 1914 via network interface 1908, a storage area network ("SAN"), a high-speed serial bus, and/or other suitable communication technology.
  • database 1914 may comprise one or more storage resources provisioned from a "cloud storage” provider, for example, Amazon Simple Storage Service (“Amazon S3”), provided by Amazon.com, Inc. of Seattle, Washington, Google Cloud Storage, provided by Google, Inc. of Mountain View, California, and the like.
  • a multimodal sensor empowered awareness system for target recovery and object path prediction provides for a fast recovery of lost targets by empowering an autonomous robotic system (ARS) awareness engine with multimodal sensors.
  • the system tracks targets visually using a combination of visual and acoustic tracking sensors.
  • the system employs a main tracking sensor (e.g., optical video) that feeds into a spatiotemporal engine on the ARS.
  • Proximal sensors complement the main tracking sensor.
  • Using non-visual fast processing sensors that give rough directionality of the signal allows for prioritization of the visual target scanning process.
  • the sensors may include sound, RF, LIDAR, RADAR, GPS, and potentially other proximity sensors that do not isolate the location of a possible target, but provide a general direction to be scanned as a priority.
  • the system may thus implement multi-object path and collision prediction.
  • Figure 20 illustrates an embodiment of drone operation logic.
  • the drone operation logic 2000 illustrated in Figure 20 may be utilized to implement a system operating a multimodal sensor empowered awareness engine 2100 and a multimodal sensor empowered awareness engine 2200 as illustrated in Figure 21 and Figure 22, respectively.
  • drone operation logic 2000 comprises a main controller 2004 that controls and coordinates the operation of other components as well as providing general computational capabilities (e.g., to execute image processing 2018).
  • the main controller 2004 may comprise a central processing unit and/or one or more controllers or combinations of these components.
  • the drone operation logic 2000 will typically comprise memory 2008 which may be utilized by the main controller 2004 and other components (e.g., the DSP 2026 and/or the GPU 2022) to read and write instructions (commands) and data (operands for the instructions).
  • At least one camera 2016 may interface to image processing 2018 logic to record images and video from the environment.
  • the image processing 2018 may operate to provide image/video enhancement, compression, feature extraction, and other transformations, and provide these to the main controller 2004 for further processing and storage to memory 2008.
  • the image processing 2018 may further utilize a navigation board 2002 and/or DSP 2026 toward these ends. Images and video stored in the memory 2008 may also be read and processed by the main controller 2004, DSP 2026, and/or the GPU 2022.
  • the drone operation logic 2000 may operate on power received from a battery 2014.
  • the battery 2014 capability, charging, and energy supply may be managed by a power manager 2010.
  • the drone operation logic 2000 may transmit wireless signals of various types and range (e.g., cellular, WiFi, Bluetooth, and near field communication (NFC)) using the wireless communication logic 2020 and/or other transducers 2024.
  • the drone operation logic 2000 may also receive these types of wireless signals.
  • Wireless signals are transmitted and received using one or more antennas.
  • Other forms of electromagnetic radiation may be used to interact with proximate devices, such as infrared (not illustrated).
  • the drone operation logic 2000 may include a navigation board 2002 which includes a motor control 2006 using flight code (to operate propellers and/or landing gear), an altimeter 2028, a gyroscope 2030, and local memory 2012.
  • a system operating a multimodal sensor empowered awareness engine 2100 comprises a short range and long range sensors 2102, a sensor control systems 2104, a pixel to vector pipeline 2110, a detection/localization engine 2108, and a mitigation system 2106.
  • the detection localization engine 2108 comprises an object path predictor 2118, high resolution tracker 2112, and a low resolution tracker 2114.
  • the system operating a multimodal sensor empowered awareness engine 2100 may be operated in accordance with the process described in Figure 22.
  • a multimodal sensor empowered awareness engine 2200 detects that a lock on the tracked target has been lost (block 2202).
  • the multimodal sensor empowered awareness engine 2200 checks the proximal sensors to identify the lost target (block 2204).
  • the multimodal sensor empowered awareness engine 2200 ranks the probability of detecting the target based on object path prediction (block 2206).
  • the multimodal sensor empowered awareness engine 2200 moves the camera towards a proximal sensor with the highest detection probability (block 2208).
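  • The recovery loop of blocks 2202-2208 can be sketched as follows; the sensor names and the ranking inputs are assumptions for illustration.

```python
# Minimal sketch of the lost-target recovery loop: when visual lock is lost,
# rank the proximal sensors by the predicted probability that the target lies
# in their direction, then point the camera toward the best one. The callables
# and sensor readings are assumed interfaces, not the disclosed implementation.

def recover_target(proximal_sensors, path_predictor, point_camera):
    """proximal_sensors: dict name -> detection reading (None if nothing sensed).

    path_predictor(name, reading) -> probability the target is in that sensor's direction.
    point_camera(name) slews the main tracking sensor toward that direction.
    """
    candidates = {name: path_predictor(name, reading)
                  for name, reading in proximal_sensors.items()
                  if reading is not None}
    if not candidates:
        return None                                     # nothing sensed; keep scanning
    best = max(candidates, key=candidates.get)
    point_camera(best)
    return best
```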
  • a system operating a multimodal sensor empowered awareness engine 2300 illustrates a drone 2304 comprising a camera range 2302 as the range of the main tracking sensor and a first secondary sensor range 2306, second secondary sensor range 2308, a third secondary sensor range 2310, and a fourth secondary sensor range 2312, as the range of the complementary proximal sensors.
  • a system operating a multimodal sensor empowered awareness engine 2400 comprises a drone 2402 and an out of range target 2408 from a camera range 2404, in a secondary sensor range 2406.
  • An example aspect includes optimized video processing scheme scheduling.
  • processing schemes include localization algorithms that process video frames of one or more video data streams.
  • the processing schemes produce a prediction of objects within the video frame, as well as a confidence number that is a measure of quality for that prediction.
  • the confidence number can be normalized to a canonical range (0-1) and used to compare the confidence of each different processing scheme to track an item of interest.
  • Costs on all data inputs for the processing schemes can be pre-determined, estimated, or calculated.
  • a type of algorithm that operates on visual "features" can be dependent on the number of features in the video frame, and its cost can be estimated based on the number of features. As the number of features increases, the cost can increase dramatically.
  • algorithms may differ in their effectiveness based on the type of input received. Robustness of each algorithm (e.g., processing scheme) can be compared by measuring a population of inputs that are selected as representative of sample conditions (e.g., real-life data sets) and prioritized by application requirements.
  • the example implementation can statically assign each processing scheme in a set of processing schemes to a group based on each processing scheme's determined cost and/or robustness estimations on the selected input group.
  • processing schemes can be grouped as: (i) high cost, high robustness; (ii) medium cost, medium robustness; and (iii) low cost, low robustness.
  • a set of processing schemes are examined from group to group, and the process cycles through the processing schemes to produce a detection of an object.
  • an action is performed to stop or change a running group (i) and select or switch to group (ii).
  • For example, a processing scheme from group (i) (high cost, low refresh rate, high robustness) can be exchanged for a cheaper processing scheme from group (ii) (medium cost, medium robustness). Medium cost algorithms from group (ii) can be run at a higher refresh rate while the confidence level of the result is monitored.
  • In some embodiments, a voting system fuses estimations from high confidence results and validates the result with an independent validation method. If the confidence is below a low threshold, falls outside of a range, or the validation method fails, group (i) processing is performed. If the confidence is determined to be high, then group (iii) algorithms can be applied to optimize resources. Different groups of algorithms (e.g., group (iii) and group (ii)) may be similar and selected based on different secondary factors to optimize local computing resources. For example, group (iii) algorithms may operate like group (ii) with a similar validation method but have secondary factors such as faster and more brittle performance.
  • processing schemes may have no confidence measure and depend solely on the validation method to determine a performance score or detect a failure. For example, if the confidence is low in group (iii) or the validation method fails, group (ii) processing is selected to be the active processing scheme for processing the video data.
  • the input can allow the system to invoke the group or processing scheme from the set of processing schemes that can detect the object, and constantly optimize the processing scheme that isolates the object from the background detected in the video data. For example, a processing scheme may more efficiently process video data to detect and track items from a white background.
  • Processing scheme examples for group (i) (high cost, high robustness) can include: a modified tiny You Only Look Once (YOLO) Convolutional Neural Network (CNN) on 448 pixel input tiles in the frame; a modified SSD (Single Shot Detection) CNN on 300 pixel multi-scale classification; a modified Faster R-CNN (Region Proposal Networks), segmentation analysis of the frame + classifier for each candidate, etc.
  • Processing scheme examples for groups (ii) and (iii) can include using a SqueezeNet super fast rejection CNN as the validation method on the estimated location produced by the algorithms.
  • a 227-pixel input can be run in the location predicted by the algorithms and used as a part of the score for the result estimation.
  • Group (ii) Medium cost - medium robustness processing scheme examples can include: color-based tracking - hue, saturation, value (HSV) channel statistical modeling; feature based tracking - Oriented FAST and rotated BRIEF (ORB) features + descriptors, consensus of movement of keypoints (optical flow); movement based tracking - ego-motion compensation and background subtraction, etc.
  • Group (iii) Low-cost - low robustness processing scheme may have no confidence measure and depend solely on the validation method to determine a performance score and/or detect a failure.
  • Group (iii) Low-cost - low robustness processing scheme examples can include: extrapolation of location of object from past locations (e.g., no confidence measure (always max), depend on validation method to reject estimation, etc.); SqueezeNet super fast rejection CNN; template matching to the last known object appearance based on past detections, etc.
  • the foregoing algorithms are examples, and the present inventive concept is not limited thereto. Other example algorithms may be substituted therefor without departing from the inventive scope, as would be understood by those skilled in the art.
  • Logic refers to machine memory circuits, non-transitory machine readable media, and/or circuitry which by way of its material and/or material-energy configuration comprises control and/or procedural signals, and/or settings and values (such as resistance, impedance, capacitance, inductance, current/voltage ratings, etc.), that may be applied to influence the operation of a device.
  • Magnetic media, electronic circuits, electrical and optical memory are examples of logic.
  • Logic specifically excludes pure signals or software per se (however does not exclude machine memories comprising software and thereby forming configurations of matter).
  • Those skilled in the art will appreciate that logic may be distributed throughout one or more devices, and/or may be comprised of combinations of memory, media, processing circuits and controllers, other circuits, and so on. Therefore, in the interest of clarity and correctness, logic may not always be distinctly illustrated in drawings of devices and systems, although it is inherently present therein.
  • the implementer may opt for a hardware and/or firmware vehicle; alternatively, if flexibility is paramount, the implementer may opt for a solely software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware.
  • Example hardware implementations include Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), and digital signal processors (DSPs).
  • Examples of signal bearing media include, but are not limited to, the following: recordable type media such as floppy disks, hard disk drives, CD ROMs, digital tape, flash drives, SD cards, solid state fixed or removable storage, and computer memory.
  • circuitry includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), circuitry forming a memory device (e.g., forms of random access memory), and/or circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment).

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Astronomy & Astrophysics (AREA)
  • Electromagnetism (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Traffic Control Systems (AREA)

Abstract

A flight control operation of a reference aerial vehicle is performed. For example, an image captured by an image sensor of the reference aerial vehicle is received. A target is detected in the image. A three-dimensional relative location of the target with respect to the reference aerial vehicle is determined based on the image. The flight control operation is performed based on the three-dimensional relative location of the target with respect to the reference aerial vehicle.
PCT/US2018/053084 2017-10-01 2018-09-27 Commande de vol à l'aide d'une vision par ordinateur WO2019067695A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201762566449P 2017-10-01 2017-10-01
US62/566,449 2017-10-01
US15/729,581 US10325169B2 (en) 2016-10-09 2017-10-10 Spatio-temporal awareness engine for priority tree based region selection across multiple input cameras and multimodal sensor empowered awareness engine for target recovery and object path prediction
US15/729,581 2017-10-10
US16/142,452 2018-09-26
US16/142,452 US10514711B2 (en) 2016-10-09 2018-09-26 Flight control using computer vision

Publications (1)

Publication Number Publication Date
WO2019067695A1 true WO2019067695A1 (fr) 2019-04-04

Family

ID=65903439

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/053084 WO2019067695A1 (fr) 2017-10-01 2018-09-27 Commande de vol à l'aide d'une vision par ordinateur

Country Status (1)

Country Link
WO (1) WO2019067695A1 (fr)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110132049A (zh) * 2019-06-11 2019-08-16 Nanjing Forest Police College Automatic aiming sniper rifle based on an unmanned aerial vehicle platform
RU2717047C1 (ru) * 2019-08-19 2020-03-18 Federal State Autonomous Educational Institution of Higher Education "National Research Tomsk State University" (TSU, NR TSU) Complex for distributed control of intelligent robots for countering small-sized unmanned aerial vehicles
RU2722521C1 (ru) * 2019-09-13 2020-06-01 Limited Liability Company "STILSOFT" Method for precision landing of an unmanned aerial vehicle on a landing platform
EP3751302A1 (fr) * 2019-06-13 2020-12-16 The Boeing Company Procédés et systèmes de perception de machine acoustique pour un aéronef
WO2021198634A1 (fr) * 2020-03-31 2021-10-07 Sony Group Corporation Dispositif, programme informatique et procédé de surveillance d'un uav
CN113784050A (zh) * 2021-09-17 2021-12-10 深圳市道通智能航空技术股份有限公司 一种图像获取方法、装置、飞行器和存储介质
CN114199205A (zh) * 2021-11-16 2022-03-18 河北大学 基于改进四叉树orb算法的双目测距方法
US11403069B2 (en) 2017-07-24 2022-08-02 Tesla, Inc. Accelerated mathematical engine
US11409692B2 (en) 2017-07-24 2022-08-09 Tesla, Inc. Vector computational unit
US11487288B2 (en) 2017-03-23 2022-11-01 Tesla, Inc. Data synthesis for autonomous control systems
US11537811B2 (en) 2018-12-04 2022-12-27 Tesla, Inc. Enhanced object detection for autonomous vehicles based on field view
US11561791B2 (en) 2018-02-01 2023-01-24 Tesla, Inc. Vector computational unit receiving data elements in parallel from a last row of a computational array
US11562231B2 (en) 2018-09-03 2023-01-24 Tesla, Inc. Neural networks for embedded devices
US11567514B2 (en) 2019-02-11 2023-01-31 Tesla, Inc. Autonomous and user controlled vehicle summon to a target
US11610117B2 (en) 2018-12-27 2023-03-21 Tesla, Inc. System and method for adapting a neural network model on a hardware platform
US11636333B2 (en) 2018-07-26 2023-04-25 Tesla, Inc. Optimizing neural network structures for embedded systems
US11665108B2 (en) 2018-10-25 2023-05-30 Tesla, Inc. QoS manager for system on a chip communications
US11667376B1 (en) * 2021-11-12 2023-06-06 Beta Air, Llc System and method for flight control compensation for component degradation
US11681649B2 (en) 2017-07-24 2023-06-20 Tesla, Inc. Computational array microprocessor system using non-consecutive data formatting
US20230222809A1 (en) * 2022-01-12 2023-07-13 Mazen A. Al-Sinan Autonomous low-altitude uav detection system
US11734562B2 (en) 2018-06-20 2023-08-22 Tesla, Inc. Data pipeline and deep learning system for autonomous driving
US11748620B2 (en) 2019-02-01 2023-09-05 Tesla, Inc. Generating ground truth for machine learning from time series elements
US11790664B2 (en) 2019-02-19 2023-10-17 Tesla, Inc. Estimating object properties using visual image data
US11816585B2 (en) 2018-12-03 2023-11-14 Tesla, Inc. Machine learning models operating at different frequencies for autonomous vehicles
US11841434B2 (en) 2018-07-20 2023-12-12 Tesla, Inc. Annotation cross-labeling for autonomous control systems
US11893393B2 (en) 2017-07-24 2024-02-06 Tesla, Inc. Computational array microprocessor system with hardware arbiter managing memory requests
US11893774B2 (en) 2018-10-11 2024-02-06 Tesla, Inc. Systems and methods for training machine models with augmented data
US12014553B2 (en) 2019-02-01 2024-06-18 Tesla, Inc. Predicting three-dimensional features for autonomous driving

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170161907A1 (en) * 2008-12-19 2017-06-08 Reconrobotics, Inc. System and method for determining an orientation and position of an object
US20160068267A1 (en) * 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd Context-based flight mode selection
US20160202695A1 (en) * 2014-09-12 2016-07-14 4D Tech Solutions, Inc. Unmanned aerial vehicle 3d mapping system
US20160246297A1 (en) * 2015-02-24 2016-08-25 Siemens Corporation Cloud-based control system for unmanned aerial vehicles

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12020476B2 (en) 2017-03-23 2024-06-25 Tesla, Inc. Data synthesis for autonomous control systems
US11487288B2 (en) 2017-03-23 2022-11-01 Tesla, Inc. Data synthesis for autonomous control systems
US11403069B2 (en) 2017-07-24 2022-08-02 Tesla, Inc. Accelerated mathematical engine
US11681649B2 (en) 2017-07-24 2023-06-20 Tesla, Inc. Computational array microprocessor system using non-consecutive data formatting
US11893393B2 (en) 2017-07-24 2024-02-06 Tesla, Inc. Computational array microprocessor system with hardware arbiter managing memory requests
US11409692B2 (en) 2017-07-24 2022-08-09 Tesla, Inc. Vector computational unit
US11797304B2 (en) 2018-02-01 2023-10-24 Tesla, Inc. Instruction set architecture for a vector computational unit
US11561791B2 (en) 2018-02-01 2023-01-24 Tesla, Inc. Vector computational unit receiving data elements in parallel from a last row of a computational array
US11734562B2 (en) 2018-06-20 2023-08-22 Tesla, Inc. Data pipeline and deep learning system for autonomous driving
US11841434B2 (en) 2018-07-20 2023-12-12 Tesla, Inc. Annotation cross-labeling for autonomous control systems
US11636333B2 (en) 2018-07-26 2023-04-25 Tesla, Inc. Optimizing neural network structures for embedded systems
US11562231B2 (en) 2018-09-03 2023-01-24 Tesla, Inc. Neural networks for embedded devices
US11983630B2 (en) 2018-09-03 2024-05-14 Tesla, Inc. Neural networks for embedded devices
US11893774B2 (en) 2018-10-11 2024-02-06 Tesla, Inc. Systems and methods for training machine models with augmented data
US11665108B2 (en) 2018-10-25 2023-05-30 Tesla, Inc. QoS manager for system on a chip communications
US11816585B2 (en) 2018-12-03 2023-11-14 Tesla, Inc. Machine learning models operating at different frequencies for autonomous vehicles
US11537811B2 (en) 2018-12-04 2022-12-27 Tesla, Inc. Enhanced object detection for autonomous vehicles based on field view
US11908171B2 (en) 2018-12-04 2024-02-20 Tesla, Inc. Enhanced object detection for autonomous vehicles based on field view
US11610117B2 (en) 2018-12-27 2023-03-21 Tesla, Inc. System and method for adapting a neural network model on a hardware platform
US11748620B2 (en) 2019-02-01 2023-09-05 Tesla, Inc. Generating ground truth for machine learning from time series elements
US12014553B2 (en) 2019-02-01 2024-06-18 Tesla, Inc. Predicting three-dimensional features for autonomous driving
US11567514B2 (en) 2019-02-11 2023-01-31 Tesla, Inc. Autonomous and user controlled vehicle summon to a target
US11790664B2 (en) 2019-02-19 2023-10-17 Tesla, Inc. Estimating object properties using visual image data
CN110132049A (zh) * 2019-06-11 2019-08-16 南京森林警察学院 一种基于无人机平台的自动瞄准式狙击步枪
US11531100B2 (en) 2019-06-13 2022-12-20 The Boeing Company Methods and systems for acoustic machine perception for an aircraft
EP3751302A1 (fr) * 2019-06-13 2020-12-16 The Boeing Company Procédés et systèmes de perception de machine acoustique pour un aéronef
RU2717047C1 (ru) * 2019-08-19 2020-03-18 Федеральное государственное автономное образовательное учреждение высшего образования "Национальный исследовательский Томский государственный университет" (ТГУ, НИ ТГУ) Комплекс распределенного управления интеллектуальными роботами для борьбы с малогабаритными беспилотными летательными аппаратами
RU2722521C1 (ru) * 2019-09-13 2020-06-01 Общество с ограниченной ответственностью "СТИЛСОФТ" Способ точной посадки беспилотного летательного аппарата на посадочную платформу
CN115335886A (zh) * 2020-03-31 2022-11-11 索尼集团公司 用于监视uav的设备、计算机程序以及方法
WO2021198634A1 (fr) * 2020-03-31 2021-10-07 Sony Group Corporation Dispositif, programme informatique et procédé de surveillance d'un uav
CN113784050A (zh) * 2021-09-17 2021-12-10 深圳市道通智能航空技术股份有限公司 一种图像获取方法、装置、飞行器和存储介质
CN113784050B (zh) * 2021-09-17 2023-12-12 深圳市道通智能航空技术股份有限公司 一种图像获取方法、装置、飞行器和存储介质
US11667376B1 (en) * 2021-11-12 2023-06-06 Beta Air, Llc System and method for flight control compensation for component degradation
CN114199205B (zh) * 2021-11-16 2023-09-05 河北大学 基于改进四叉树orb算法的双目测距方法
CN114199205A (zh) * 2021-11-16 2022-03-18 河北大学 基于改进四叉树orb算法的双目测距方法
US20230222809A1 (en) * 2022-01-12 2023-07-13 Mazen A. Al-Sinan Autonomous low-altitude uav detection system

Similar Documents

Publication Publication Date Title
US10514711B2 (en) Flight control using computer vision
WO2019067695A1 (fr) Commande de vol à l'aide d'une vision par ordinateur
US20200158822A1 (en) Unmanned aerial vehicle radar detection
Ganti et al. Implementation of detection and tracking mechanism for small UAS
US10408936B2 (en) LIDAR light fence to cue long range LIDAR of target drone
US20200162489A1 (en) Security event detection and threat assessment
Khan et al. On the detection of unauthorized drones—Techniques and future perspectives: A review
Ma'Sum et al. Simulation of intelligent unmanned aerial vehicle (UAV) for military surveillance
Vrba et al. Onboard marker-less detection and localization of non-cooperating drones for their safe interception by an autonomous aerial system
US10325169B2 (en) Spatio-temporal awareness engine for priority tree based region selection across multiple input cameras and multimodal sensor empowered awareness engine for target recovery and object path prediction
KR101948569B1 (ko) 라이다 센서 및 팬틸트줌 카메라를 활용한 비행체 식별 시스템 및 그 제어 방법
Chun et al. Robot surveillance and security
US8965044B1 (en) Rotorcraft threat detection system
Cho et al. Vision-based detection and tracking of airborne obstacles in a cluttered environment
KR102477584B1 (ko) 무인항공기 감시 방법 및 장치
RU2755603C2 (ru) Система и способ обнаружения и противодействия беспилотным летательным аппаратам
US20210295530A1 (en) Moving Object Detection System
AU2021236537A1 (en) Object tracking system including stereo camera assembly and methods of use
Elsayed et al. Review on real-time drone detection based on visual band electro-optical (EO) sensor
Yasmine et al. Survey on current anti-drone systems: process, technologies, and algorithms
Dogru et al. Tracking drones with drones using millimeter wave radar
Barisic et al. Brain over Brawn: Using a Stereo Camera to Detect, Track, and Intercept a Faster UAV by Reconstructing the Intruder's Trajectory
Dolph et al. Detection and Tracking of Aircraft from Small Unmanned Aerial Systems
CN112580421A (zh) 用于探测无人驾驶飞行器的***和方法
CN112580420A (zh) 用于抵制无人驾驶飞行器的***和方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18863622
    Country of ref document: EP
    Kind code of ref document: A1

NENP Non-entry into the national phase
    Ref country code: DE

122 Ep: pct application non-entry in european phase
    Ref document number: 18863622
    Country of ref document: EP
    Kind code of ref document: A1