WO2024118016A1 - A method for day and night unmanned aerial vehicle detection for unmanned ground vehicles - Google Patents


Info

Publication number
WO2024118016A1
WO2024118016A1 (PCT/TR2023/050972)
Authority
WO
WIPO (PCT)
Prior art keywords
unmanned aerial
aerial vehicle
unmanned
surveillance camera
vehicles
Prior art date
Application number
PCT/TR2023/050972
Other languages
French (fr)
Inventor
Halil AKBULUT
Baris Yalcin
Atakan AKBULUT
Salih AKARSU
Gurkan CETIN
Mehmet Onur OZCELIK
Muhittin SOLMAZ
Original Assignee
Havelsan Hava Elektronik San. Ve Tic. A.S.
Application filed by Havelsan Hava Elektronik San. Ve Tic. A.S. filed Critical Havelsan Hava Elektronik San. Ve Tic. A.S.
Publication of WO2024118016A1 publication Critical patent/WO2024118016A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks



Abstract

The invention relates to a method of detecting flying unmanned aerial vehicles by using RGB or thermal images obtained with the surveillance camera in unmanned ground vehicles and scanning 360 degrees.

Description

A METHOD FOR DAY AND NIGHT UNMANNED AERIAL VEHICLE DETECTION FOR UNMANNED GROUND VEHICLES
Technical Field
The invention relates to a method of detecting flying unmanned aerial vehicles by using RGB or thermal images obtained with the surveillance camera in unmanned ground vehicles and scanning 360 degrees.
Prior Art
The surveillance camera used in today's unmanned ground vehicles provides only RGB or thermal images to the user via the ground control station. From this image alone, it is up to the operator to accurately detect an unmanned aerial vehicle. Accurate detection by the operator depends on his/her experience, attention, observation skills and familiarity with many types of unmanned aerial vehicles.
The operator of the surveillance camera must also have sufficient knowledge and experience in camera control. To scan the elements in the environment with the camera, the operator must use a manual control to sweep the environment through 360 degrees. In addition, tracking of an unmanned aerial vehicle, whether detected by the operator or selected by the operator for tracking, is always carried out manually with a remote control.
In the known state of the art, the Chinese Patent numbered CN113298053 A mentions a multi-target unmanned aerial vehicle tracking identification method and apparatus, an electronic device and a storage medium. The document describes the use of Yolo V3 and DeepSort algorithms as deep learning algorithms. It also mentions the acquisition of binocular camera images at certain intervals.
In the known state of the art, Chinese Patent numbered CN113822153 A mentions an unmanned aerial vehicle tracking method based on an improved DeepSort algorithm. The document mentions that the DeepSort algorithm is used in the detection and tracking of unmanned aerial vehicles.
In the known state of the art, the Chinese document CN112465854A describes an unmanned aerial vehicle tracking method based on a non-anchor point detection algorithm. The document mentions the DeepSort, Yolo and R-CNN algorithms.
In the known state of the art, the United States Patent US2021188435A1 mentions a system for neutralizing a target unmanned aerial vehicle, comprising an unmanned aerial vehicle detection system with at least one detection sensor that can be deployed to detect a plurality of counter-attack unmanned aerial vehicles and a target unmanned aerial vehicle in flight.
When the systems available in the art were examined, there was a need to realize a method in which flying unmanned aerial vehicles are detected by using RGB or thermal images obtained with the surveillance camera on unmanned ground vehicles and scanning 360 degrees.
Objectives of the Invention
The object of the present invention is to realize a method of detecting flying unmanned aerial vehicles by using RGB or thermal images obtained with the surveillance camera in unmanned ground vehicles and scanning 360 degrees.
Another object of the present invention is to provide a method of performing 360-degree area scanning and camera zooming operations more effectively and quickly with a surveillance camera.
Detailed description of the invention
The invention relates to a method of detecting flying unmanned aerial vehicles by using RGB or thermal images obtained with the surveillance camera in unmanned ground vehicles and performing 360-degree scanning, comprising the steps of:
- scanning the perimeter with the perimeter surveillance camera in 360 degrees and in the x, y and z axes in certain periods and obtaining images,
- creating a data set by sending the RGB or thermal image obtained by the surveillance camera during the scanning process to the artificial intelligence computer instantaneously,
- labeling the unmanned aerial vehicles in the data set and creating pixel coordinates of each unmanned aerial vehicle in the data set,
- then using the YoloX [1], Yolov5 [2], MMDetection [3] or Detectron2 [4] algorithms to detect the unmanned aerial vehicle,
- after detecting the unmanned aerial vehicle, dividing the images into grids and performing object detection on the divided grids,
- sending the detected unmanned aerial vehicle information from the artificial intelligence computer to the communication module,
- sending the incoming detection information wirelessly from the communication module to the ground control station,
- enabling the user to view information about the detected unmanned aerial vehicles via the ground control station,
- marking, on the ground control station screen, the unmanned aerial vehicle that the user wants to track from among the detected unmanned aerial vehicles,
- transmitting the information of the marked unmanned aerial vehicle to the artificial intelligence computer via the communication module,
- tracking and direction finding of the unmanned aerial vehicle with the Lucas-Kanade algorithm [5] in the artificial intelligence computer.
In the method of the invention, the surveillance camera scans the surroundings in the x, y and z axes through 360 degrees, at certain periods, with the help of an artificial intelligence computer. With the help of the surveillance algorithm, the surveillance camera scans the surroundings in the x, y and z axes. When the surveillance camera is first turned on, it starts from the initial reference angle at the top left corner. During the scanning process, it sweeps the x-axis by rotating through a specified angle at each time step. For example, after starting at (0, 90) and reaching (359, 90), it decreases the y-axis by 5 degrees and sweeps from (0, 85) to (359, 85). In this way, the scanning process is completed for one period when the camera reaches (359, 40). The height of the surveillance camera, that is, the z-axis, can be changed manually by the user. In the same way, the height can be decreased or increased after certain scanning periods by the surveillance algorithm. This process continues until an unmanned aerial vehicle is detected. When one or more unmanned aerial vehicles are detected, the 360-degree perimeter surveillance algorithm stops the scanning process. If the unmanned aerial vehicle leaves the perimeter surveillance camera's field of view, the 360-degree perimeter surveillance algorithm starts working again.
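The sweep described above can be sketched as a simple angle-waypoint generator. This is an illustrative sketch, not part of the patent: the function name, the default step sizes and the 1-degree pan resolution are assumptions; only the example sweep from (0, 90) down to (359, 40) in 5-degree tilt steps comes from the text.

```python
def scan_waypoints(pan_step=1, tilt_start=90, tilt_stop=40, tilt_step=5):
    """Yield (pan, tilt) camera angles for one full surveillance period.

    The pan (x-axis) sweeps 0..359 degrees at each tilt level; the tilt
    (y-axis) then drops by tilt_step degrees until tilt_stop, matching
    the example sweep (0, 90) -> (359, 90), (0, 85) -> (359, 85), ...,
    ending at (359, 40).
    """
    for tilt in range(tilt_start, tilt_stop - 1, -tilt_step):
        for pan in range(0, 360, pan_step):
            yield pan, tilt
```

A controller would iterate over these waypoints, move the gimbal to each one, and stop iterating as soon as the detector reports an unmanned aerial vehicle.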
During this scanning process, the surveillance camera sends RGB or thermal images to the artificial intelligence computer.
In the artificial intelligence computer, a data set is created from many different unmanned aerial vehicle images, both RGB and thermal. The unmanned aerial vehicles in this data set are labeled and the pixel coordinates of each unmanned aerial vehicle in the data set are created. The YoloX [1], Yolov5 [2], MMDetection [3] and Detectron2 [4] algorithms, which are among the most widely used deep learning-based object detection algorithms in the literature, are trained on the unmanned aerial vehicle data set, and the algorithm that gives the best result is selected. Deep learning-based unmanned aerial vehicle detection is realized with the selected algorithm.
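As an illustration of the labeling step, the pixel coordinates of a boxed unmanned aerial vehicle can be stored in the widely used YOLO text label format. The patent does not specify a label format, so this convention, the function name and the (x_min, y_min, x_max, y_max) box layout are assumptions:

```python
def to_yolo_label(box, img_w, img_h, cls=0):
    """Convert a pixel-coordinate box (x_min, y_min, x_max, y_max) into a
    YOLO-style label line: "class x_center y_center width height", with
    all four geometry values normalized to [0, 1] by the image size.
    """
    x0, y0, x1, y1 = box
    xc = (x0 + x1) / 2 / img_w   # normalized box center x
    yc = (y0 + y1) / 2 / img_h   # normalized box center y
    w = (x1 - x0) / img_w        # normalized box width
    h = (y1 - y0) / img_h        # normalized box height
    return f"{cls} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"
```

One such line per labeled vehicle, per image, is enough to train any of the detectors named above.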
The deep learning-based unmanned aerial vehicle detection algorithm in the artificial intelligence computer detects the unmanned aerial vehicle from the incoming RGB or thermal camera image.
When object detection is performed with algorithms such as YoloX [1], Yolov5 [2], MMDetection [3] or Detectron2 [4] on high-resolution images, such as 4K (4096x2160) images, small objects cannot be found. The reason is that the input resolution of the object detection algorithm is, for example, 512x512, so the 4K (4096x2160) image must be scaled down to 512x512, which causes feature loss in the image. Small objects cannot be detected accurately because the image is scaled to a smaller size.
In the method of the invention, the high-resolution images are divided into grids of certain dimensions, for example 512x512, and the object detection process is performed over the divided grids. The pixel coordinate information of the detected objects is scaled and applied to the high-resolution image again. In this way, feature loss in the detection of small objects is prevented and the object detection process is performed successfully.
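The grid-based detection described above can be sketched as follows. The helper name, the 512x512 default tile size and the box tuple format are illustrative assumptions; `detect_fn` stands in for the trained YOLO-style detector, which the patent does not expose as an API.

```python
import numpy as np

TILE = 512  # detector input size assumed in the example above

def detect_tiled(image, detect_fn, tile=TILE):
    """Run an object detector tile-by-tile over a high-resolution image.

    detect_fn(patch) -> list of (x, y, w, h, score) boxes in the patch's
    own pixel coordinates.  Each returned box is shifted back into
    full-image coordinates, so the image is never downscaled and small
    objects keep their features.
    """
    h, w = image.shape[:2]
    boxes = []
    for y0 in range(0, h, tile):
        for x0 in range(0, w, tile):
            patch = image[y0:y0 + tile, x0:x0 + tile]
            for (x, y, bw, bh, score) in detect_fn(patch):
                # rescale/offset patch coordinates to the original image
                boxes.append((x + x0, y + y0, bw, bh, score))
    return boxes
```

Edge tiles are simply smaller than 512x512; slicing handles them without padding, though a real pipeline might also overlap tiles to avoid cutting a vehicle in half at a grid boundary.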
Unmanned aerial vehicle detection information is sent from the artificial intelligence computer to the communication module. The information received by the communication module is sent wirelessly to the ground control station. The operator accesses the information of the detected unmanned aerial vehicles via the ground control station. The operator marks the unmanned aerial vehicle to be tracked via the ground control station, and the information of the marked unmanned aerial vehicle is transmitted to the artificial intelligence computer via the communication module. The artificial intelligence computer receives the information about the unmanned aerial vehicle to be tracked, performs the necessary tracking process with the help of the Lucas-Kanade algorithm [5], and transmits the relevant motion signals to the surveillance camera, which is controlled accordingly.
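The Lucas-Kanade tracking step can be illustrated with a minimal single-scale gradient solve. This is a sketch only: practical trackers (e.g. OpenCV's pyramidal `calcOpticalFlowPyrLK`) add image pyramids and iteration, and the function name and window size here are assumptions, not from the patent.

```python
import numpy as np

def lucas_kanade_step(prev, curr, point, win=7):
    """One Lucas-Kanade step: estimate the (dx, dy) displacement of
    `point` = (x, y) between two grayscale frames by solving
    Ix*dx + Iy*dy = -It in least squares over a (2*win+1)^2 window.
    """
    prev = prev.astype(np.float64)
    curr = curr.astype(np.float64)
    Iy, Ix = np.gradient(prev)   # spatial gradients (rows = y, cols = x)
    It = curr - prev             # temporal gradient
    x, y = point
    sl = (slice(y - win, y + win + 1), slice(x - win, x + win + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    (dx, dy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return dx, dy
```

Repeating this step on the marked vehicle's position each frame yields the motion vector from which the camera's pan/tilt correction signals can be derived.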
Unmanned aerial vehicle detection with the surveillance camera used in unmanned land vehicles is realized faster and more reliably thanks to the method of the invention. In this way, the need for the knowledge, labor and similar resources required by the user-dependent detection process is overcome.
The 360-degree area scanning and camera zooming operations, which are performed manually with a perimeter surveillance camera, are performed more effectively and quickly by the 360-degree perimeter surveillance algorithm. Thanks to the algorithm used, the problem of slow scanning that may occur with the manual procedure and the problem of not scanning every place with the camera in a certain period of time will be overcome.
The information outputs of the unmanned aerial vehicles are displayed to the user via the ground control station. The unmanned aerial vehicle to be tracked by the camera will be selected by the operator and the tracking process will be performed autonomously.
The method of the invention, which is intended to be used in unmanned land vehicles, can additionally be integrated into a perimeter surveillance camera or manned or unmanned land, air or sea elements containing a gimbal and can be used for the detection of unmanned aerial vehicles. It is also possible to use it in perimeter surveillance cameras located permanently in a settlement.
References:
[1] Zheng Ge, Songtao Liu, Feng Wang, Zeming Li, Jian Sun. YOLOX: Exceeding YOLO Series in 2021. Megvii Technology. arXiv:2107.08430v2 [cs.CV], 6 Aug 2021.
[2] Xingkui Zhu, Shuchang Lyu, Xu Wang, Qi Zhao. TPH-YOLOv5: Improved YOLOv5 Based on Transformer Prediction Head for Object Detection on Drone-Captured Scenarios. Beihang University, Beijing, China.
[3] Kai Chen, Jiaqi Wang, Jiangmiao Pang, Yuhang Cao, Yu Xiong, Xiaoxiao Li, Shuyang Sun, Wansen Feng, Ziwei Liu, Jiarui Xu, Zheng Zhang, Dazhi Cheng, Chenchen Zhu, Tianheng Cheng, Qijie Zhao, Buyu Li, Xin Lu, Rui Zhu, Yue Wu, Jifeng Dai, Jingdong Wang, Jianping Shi, Wanli Ouyang, Chen Change Loy, Dahua Lin. MMDetection: Open MMLab Detection Toolbox and Benchmark. arXiv:1906.07155v1 [cs.CV], 17 Jun 2019.
[4] https://web.archive.org/web/20221104115756/https://github.com/facebookresearch/detectron2
[5] Dhara Patel, Saurabh Upadhyay. Optical Flow Measurement using Lucas Kanade Method. International Journal of Computer Applications (0975-8887), Volume 61, No. 10, January 2013.

Claims

1. A method of detecting flying unmanned aerial vehicles by using RGB or thermal images obtained with the surveillance camera on unmanned ground vehicles and performing 360-degree scanning, comprising the steps of:
- scanning the environment with the perimeter surveillance camera in 360 degrees and in the x, y and z axes to obtain images,
- creating a data set by sending the RGB or thermal image obtained by the surveillance camera during the scanning process to the artificial intelligence computer instantaneously,
- labeling the unmanned aerial vehicles in the data set and creating pixel coordinates of each unmanned aerial vehicle in the data set,
- then realizing the detection of the unmanned aerial vehicle using the Yolov5 [2] algorithm,
- after the detection of the unmanned aerial vehicle, dividing the images into grids and performing the object detection process on the divided grids,
- sending the detected unmanned aerial vehicle information from the artificial intelligence computer to the communication module,
- sending the incoming detection information wirelessly from the communication module to the ground control station,
- enabling the user to display information about the unmanned aerial vehicles detected via the ground control station,
- marking, on the ground control station screen, the unmanned aerial vehicle that the user wants to track from among the detected unmanned aerial vehicles,
- transmitting the information of the marked unmanned aerial vehicle to the artificial intelligence computer via the communication module,
- realizing tracking and direction finding of the unmanned aerial vehicle with the Lucas-Kanade algorithm [5] in the artificial intelligence computer.
2. A method for detecting flying unmanned aerial vehicles by using RGB or thermal images obtained by a surveillance camera on an unmanned ground vehicle and performing a 360-degree scan according to claim 1, characterized in that the YoloX [1], MMDetection [3] or Detectron2 [4] algorithms can also be used to detect the unmanned aerial vehicle.
PCT/TR2023/050972 2022-11-30 2023-09-18 A method for day and night unmanned aerial vehicle detection for unmanned ground vehicles WO2024118016A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR2022018222 2022-11-30
TR2022/018222 2022-11-30

Publications (1)

Publication Number Publication Date
WO2024118016A1 true WO2024118016A1 (en) 2024-06-06

Family

ID=91324602

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2023/050972 WO2024118016A1 (en) 2022-11-30 2023-09-18 A method for day and night unmanned aerial vehicle detection for unmanned ground vehicles

Country Status (1)

Country Link
WO (1) WO2024118016A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180338117A1 (en) * 2017-05-16 2018-11-22 Telme Electronics Inc. Surround camera system for autonomous driving
CN111161305A (en) * 2019-12-18 2020-05-15 任子行网络技术股份有限公司 Intelligent unmanned aerial vehicle identification tracking method and system
CN113589848A (en) * 2021-09-28 2021-11-02 西湖大学 Multi-unmanned aerial vehicle detection, positioning and tracking system and method based on machine vision


Similar Documents

Publication Publication Date Title
CA2767312C (en) Automatic video surveillance system and method
EP3627269A1 (en) Target tracking method and apparatus, mobile device and storage medium
US20160138919A1 (en) Geodetic surveying system
US6362875B1 (en) Machine vision system and method for inspection, homing, guidance and docking with respect to remote objects
WO2017096548A1 (en) Systems and methods for auto-return
US7773116B1 (en) Digital imaging stabilization
EP3740785B1 (en) Automatic camera driven aircraft control for radar activation
CN111966133A (en) Visual servo control system of holder
CN104792313B (en) The mapping control method of unmanned Reconnaissance system, apparatus and system
CN111192318B (en) Method and device for determining position and flight direction of unmanned aerial vehicle and unmanned aerial vehicle
CN109974713B (en) Navigation method and system based on surface feature group
CN108776491A (en) Unmanned plane multiple target monitoring system and monitoring method based on dynamic image identification
KR20170089574A (en) System for managing obstacle of ship and method for managing obstacle
JP2014063411A (en) Remote control system, control method, and program
JP2016118994A (en) Monitoring system
WO2024118016A1 (en) A method for day and night unmanned aerial vehicle detection for unmanned ground vehicles
US10778899B2 (en) Camera control apparatus
CN116817929B (en) Method and system for simultaneously positioning multiple targets on ground plane by unmanned aerial vehicle
CN112528699A (en) Method and system for obtaining identification information of a device or its user in a scene
JP2009301175A (en) Monitoring method
Kang et al. Development of a peripheral-central vision system for small UAS tracking
US10549853B2 (en) Apparatus, system, and method for determining an object's location in image video data
US11415990B2 (en) Optical object tracking on focal plane with dynamic focal length
WO2021212499A1 (en) Target calibration method, apparatus, and system, and remote control terminal of movable platform
US5543910A (en) Passive submarine range finding device and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23898455

Country of ref document: EP

Kind code of ref document: A1