WO2024043459A1 - Autonomous landing method for drone using object recognition and camera angle control - Google Patents

Autonomous landing method for drone using object recognition and camera angle control

Info

Publication number
WO2024043459A1
WO2024043459A1 (PCT/KR2023/007725; KR2023007725W)
Authority
WO
WIPO (PCT)
Prior art keywords
landing
drone
camera
landing pad
center
Prior art date
Application number
PCT/KR2023/007725
Other languages
French (fr)
Korean (ko)
Inventor
신수용
강호현
Original Assignee
금오공과대학교 산학협력단
Priority date
Filing date
Publication date
Application filed by 금오공과대학교 산학협력단 filed Critical 금오공과대학교 산학협력단
Publication of WO2024043459A1 publication Critical patent/WO2024043459A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B64C39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00 Aircraft indicators or protectors not otherwise provided for
    • B64D45/04 Landing aids; Safety measures to prevent collision with earth's surface
    • B64D45/08 Landing aids; Safety measures to prevent collision with earth's surface, optical
    • B64D47/00 Equipment not otherwise provided for
    • B64D47/08 Arrangements of cameras
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U70/00 Launching, take-off or landing arrangements
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/10 UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity

Definitions

  • the present invention relates to an autonomous landing method for a drone, and more specifically, to an autonomous landing method for a drone using object recognition and camera angle control.
  • existing autonomous landing mostly relies on GPS-based landing or on specific markers (e.g., ArUco markers, ARTags, AprilTags) for precise positioning.
  • the present invention was proposed to solve the above technical problems, and provides an autonomous landing method for a drone that can attempt a landing by finding and recognizing characteristic points of the landing pad even from a distance using object recognition and camera angle control.
  • in one embodiment, the drone learns in advance to identify the landing pad through a preset deep learning algorithm and detects the pad using the drone's camera in an area adjacent to the landing site.
  • based on the shooting information of the camera, which is set at a predetermined angle from the ground, the center point of the landing pad's bounding box is compared with the center of the screen and kept there while the drone lands diagonally.
  • the method further includes changing the angle of the camera to shoot the ground vertically and then completing the landing while the center point of the landing pad's bounding box is kept in the center of the screen, based on the camera's shooting information.
  • in another embodiment, the drone likewise learns in advance to identify the landing pad through a preset deep learning algorithm and collects shooting information from the drone's camera in an area adjacent to the landing point.
  • the center point of the landing pad's bounding box is compared with the center of the screen and kept there while the drone lands diagonally.
  • when the drone reaches a certain altitude, it lands while gradually changing the angle of the camera until it shoots the ground vertically, still comparing the center point of the landing pad's bounding box, based on the camera's shooting information, with the center of the screen.
  • an autonomous landing method for a drone is provided in which the landing is completed while the bounding box is kept in the center of the screen.
  • the autonomous landing method of a drone uses object recognition and camera angle control to find and recognize characteristic points of the landing pad even from a distance, enabling a landing attempt.
  • Figure 1 is a diagram showing the diagonal landing path after detection of the landing pad.
  • Figure 2 is an example of aligning the landing pad with the drone based on the center of the camera screen.
  • Figure 3 is an example of landing after moving directly above the landing pad at a specific altitude.
  • Figure 4 is a flow chart of the autonomous landing method of a drone according to the first embodiment.
  • Figure 5 is a flow chart of the autonomous landing method of a drone according to the second embodiment.
  • the present invention proposes a method of identifying the landing point and landing using object recognition (e.g., YOLO, MobileNet) when a drone lands autonomously.
  • because the present invention lands using object recognition, it has the advantage of flexibility: the landing pad can be replaced with various images, unlike landing that recognizes only specific markers. In addition, the characteristic points of the landing pad can be found and recognized at a greater distance than a specific marker allows, making a landing attempt possible.
  • scenario 1 (first embodiment) is as follows.
  • the drone is equipped with a GPS module and a camera; the camera is set to look 60 degrees downward during landing so that it has a wider field of view for finding the landing pad.
  • the landing pad is recognized by the camera, whose 60-degree downward angle gives a wide field of view.
  • scenario 2 (second embodiment) is as follows.
  • the drone is equipped with a GPS module and a camera; the camera is set to look 60 degrees downward during landing so that it has a wider field of view for finding the landing pad.
  • the landing pad is recognized by the camera, whose 60-degree downward angle gives a wide field of view.
  • with the camera angle at 90 degrees, looking at the ground, the drone moves directly above the landing pad; the scenario is then completed by landing vertically.
  • scenarios 1 and 2 may be used in combination.
  • Figure 4 is a flowchart of the autonomous landing method of a drone according to the first embodiment.
  • referring to Figure 4, the autonomous landing method of the drone proceeds as follows.
  • a step is taken in which the drone learns to identify the landing pad through a preset deep learning algorithm.
  • when the drone detects the landing pad and lands based on the shooting information from the drone's camera in an area adjacent to the landing point, the bounding box of the landing pad is detected based on the shooting information from the camera, which is set at an angle of 60 degrees from the ground.
  • the landing proceeds in a diagonal direction while the box's center point is compared with the center of the screen and kept there.
  • when the drone reaches a predetermined altitude, a horizontal movement step is performed by applying the trigonometric function tanθ, considering the 60-degree angle from the ground and the altitude.
  • the camera angle is then changed to shoot the ground vertically, and the landing is completed while the center point of the landing pad's bounding box is compared with the center of the screen, based on the camera's shooting information, and kept there.
  • Figure 5 is a flowchart of the autonomous landing method of a drone according to the second embodiment.
  • referring to Figure 5, the autonomous landing method of the drone proceeds as follows.
  • a step is taken in which the drone learns to identify the landing pad through a preset deep learning algorithm.
  • when the drone detects the landing pad and lands based on the shooting information from the drone's camera in an area adjacent to the landing point, the bounding box of the landing pad is detected based on the shooting information from the camera, which is set at an angle of 60 degrees from the ground.
  • the landing proceeds in a diagonal direction while the box's center point is compared with the center of the screen and kept there.
  • when the drone reaches a certain altitude, it lands while gradually changing the angle of the camera until it finally shoots the ground vertically; based on the camera's shooting information, the center point of the landing pad's bounding box is compared with the center of the screen and kept there until the landing is completed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

This autonomous landing method for a drone using object recognition and camera angle control comprises the steps of: learning in advance, by a drone, to identify a landing pad through a preset deep learning algorithm; when the drone detects the landing pad and lands on the basis of photographing information obtained by a camera of the drone in an area adjacent to a landing point, maintaining the drone to be positioned at the center of a screen through a comparison with a center point of a bounding box of the landing pad on the basis of the photographing information obtained by the camera set at a certain angle from the ground, and landing in a diagonal direction; when the drone reaches a certain altitude, horizontally moving the drone by applying the trigonometric function of tanθ in consideration of the certain angle from the ground and the altitude; and changing the angle of the camera to photograph the ground vertically, and then maintaining the drone to be positioned at the center of the screen through a comparison with the center point of the bounding box of the landing pad on the basis of the photographing information obtained by the camera, and finishing the landing.

Description

Autonomous landing method for drones using object recognition and camera angle control
The present invention relates to an autonomous landing method for a drone and, more specifically, to an autonomous landing method for a drone using object recognition and camera angle control.
Most existing autonomous landing approaches use GPS-based landing or calculate an accurate position from specific markers (e.g., ArUco markers, ARTags, AprilTags).
GPS-based landing leaves a positional error after touchdown, and marker-based methods have the disadvantages that the marker cannot be recognized when the distance is large and cannot be replaced with a different image.
The present invention is proposed to solve the above technical problems and provides an autonomous landing method for a drone that can attempt a landing by finding and recognizing characteristic points of the landing pad even from a distance, using object recognition and camera angle control.
According to one embodiment of the present invention for solving the above problems, there is provided an autonomous landing method for a drone comprising: a step in which the drone learns in advance to identify the landing pad through a preset deep learning algorithm; a step in which, when the drone detects the landing pad and proceeds to land based on shooting information from the drone's camera in an area adjacent to the landing point, the drone lands in a diagonal direction while the center point of the landing pad's bounding box is compared with the center of the screen, based on the shooting information of the camera set at a predetermined angle from the ground, and kept there; a step in which, when the drone reaches a predetermined altitude, the drone moves horizontally by applying the trigonometric function tanθ in consideration of the predetermined angle from the ground and the altitude; and a step in which the angle of the camera is changed to photograph the ground vertically and the landing is completed while the center point of the landing pad's bounding box is compared with the center of the screen, based on the camera's shooting information, and kept there.
According to another embodiment of the present invention, there is provided an autonomous landing method for a drone comprising: a step in which the drone learns in advance to identify the landing pad through a preset deep learning algorithm; a step in which, when the drone detects the landing pad and proceeds to land based on shooting information from the drone's camera in an area adjacent to the landing point, the drone lands in a diagonal direction while the center point of the landing pad's bounding box is compared with the center of the screen, based on the shooting information of the camera set at a predetermined angle from the ground, and kept there; and a step in which, when the drone reaches a predetermined altitude, the drone lands while gradually changing the angle of the camera so that it finally photographs the ground vertically, completing the landing while the center point of the landing pad's bounding box is compared with the center of the screen, based on the camera's shooting information, and kept there.
The autonomous landing method of a drone according to an embodiment of the present invention can attempt a landing by finding and recognizing characteristic points of the landing pad even from a distance, using object recognition and camera angle control.
Figure 1 is a diagram showing the diagonal landing path after detection of the landing pad.
Figure 2 is an example of aligning the landing pad with the drone based on the center of the camera screen.
Figure 3 is an example of landing after moving directly above the landing pad at a specific altitude.
Figure 4 is a flowchart of the autonomous landing method of a drone according to the first embodiment.
Figure 5 is a flowchart of the autonomous landing method of a drone according to the second embodiment.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings, in sufficient detail that a person of ordinary skill in the art to which the present invention pertains can easily implement its technical idea.
Figure 1 is a diagram showing the diagonal landing path after detection of the landing pad, Figure 2 is an example of aligning the landing pad with the drone based on the center of the camera screen, and Figure 3 is an example of landing after moving directly above the landing pad at a specific altitude.
The present invention proposes a method in which a drone identifies the landing point using object recognition (e.g., YOLO, MobileNet) and lands autonomously.
Because the present invention lands using object recognition, it has the advantage of flexibility: the landing pad can be replaced with various images, unlike landing that recognizes only a specific marker. In addition, the characteristic points of the landing pad can be found and recognized at a greater distance than a specific marker allows, making a landing attempt possible.
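As a concrete illustration of the detection step, suppose the object-recognition model (e.g., a YOLO-family network) returns labelled detections as `(label, confidence, (x1, y1, x2, y2))` tuples. The label name `landing_pad` and the confidence threshold below are assumptions for illustration, not values from the patent. A minimal helper that picks the most confident landing-pad detection and its bounding-box center might look like:

```python
from typing import Optional, Sequence, Tuple

# A detection: (label, confidence, (x1, y1, x2, y2)) -- an assumed format.
Detection = Tuple[str, float, Tuple[float, float, float, float]]

def find_landing_pad(detections: Sequence[Detection],
                     label: str = "landing_pad",
                     min_conf: float = 0.5) -> Optional[Tuple[float, float]]:
    """Return the center (cx, cy) of the most confident landing-pad box,
    or None if no sufficiently confident detection exists."""
    pads = [(conf, box) for lbl, conf, box in detections
            if lbl == label and conf >= min_conf]
    if not pads:
        return None
    _, (x1, y1, x2, y2) = max(pads, key=lambda p: p[0])
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
```

The returned center is the quantity that the later steps compare against the center of the camera screen.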
First, scenario 1 (the first embodiment) is as follows.
The drone is equipped with a GPS module and a camera; the camera is set to look 60 degrees downward during landing so that it has a wider field of view for finding the landing pad.
(1) After completing its mission, the drone returns to the GPS coordinates of the return point.
(2) The landing pad is recognized by the camera, whose 60-degree downward angle gives a wide field of view.
(3) The drone lands along the straight-line, i.e., diagonal, path to the landing pad as shown in Figure 1. As shown in Figure 2, an imaginary line is drawn at the center of the camera screen and compared with the center point of the landing pad's bounding box, and the drone lands while keeping the box in the center of the screen.
(4) If the drone gets too close to the landing pad during landing, the pad falls out of the frame and can no longer be identified. Therefore, at a specific altitude (3 m here), the drone moves forward by (current altitude × tanθ) using trigonometry, as shown in Figure 3, so that it is directly above the landing pad. The camera angle is then changed to 90 degrees (the angle that photographs the ground vertically), i.e., the camera is adjusted to face straight down. Afterwards, the landing is completed while continuing to align the landing pad and the drone as shown in Figure 2.
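The trigonometric move in step (4) can be sketched as follows. The patent gives the forward distance simply as (current altitude × tanθ); treating θ as the angle of the camera axis measured from the vertical is an interpretive assumption here, since the text does not state it explicitly:

```python
import math

def forward_offset(altitude_m: float, theta_deg: float) -> float:
    """Horizontal distance to travel so the drone ends up directly above
    the pad: altitude * tan(theta), per the patent's trigonometric step.
    theta_deg is assumed to be the camera axis's angle from the vertical."""
    return altitude_m * math.tan(math.radians(theta_deg))
```

For example, at 3 m altitude with the camera axis 30 degrees from the vertical (i.e., 60 degrees below horizontal), the drone would move forward roughly 1.73 m before pointing the camera straight down.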
Next, scenario 2 (the second embodiment) is as follows.
The drone is equipped with a GPS module and a camera; the camera is set to look 60 degrees downward during landing so that it has a wider field of view for finding the landing pad.
(1) After completing its mission, the drone returns to the GPS coordinates of the return point.
(2) The landing pad is recognized by the camera, whose 60-degree downward angle gives a wide field of view.
(3) The drone lands along the straight-line, i.e., diagonal, path to the landing pad as shown in Figure 1. As shown in Figure 2, an imaginary line is drawn at the center of the camera screen and compared with the center point of the landing pad's bounding box, and the drone lands while keeping the box in the center of the screen.
(4) During the diagonal landing, the landing proceeds while the camera angle is changed little by little toward 90 degrees downward, facing the ground.
(5) As the landing proceeds, the camera angle reaches 90 degrees, facing the ground, and the drone arrives directly above the landing pad. The scenario is then completed by landing vertically.
Scenarios 1 and 2 may also be used in combination.
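The gradual camera-angle change of scenario 2 can be sketched as a simple altitude-based schedule. The 3 m starting altitude below is an illustrative assumption, not a value from the patent, which only specifies steepening the camera little by little from 60 degrees to 90 degrees during descent:

```python
def camera_pitch_deg(altitude_m: float,
                     start_alt_m: float = 3.0,
                     start_pitch: float = 60.0,
                     final_pitch: float = 90.0) -> float:
    """Linearly steepen the camera from start_pitch (degrees below the
    horizontal) at start_alt_m down to final_pitch (straight down) at
    touchdown. The start_alt_m threshold is an assumed parameter."""
    if altitude_m >= start_alt_m:
        return start_pitch
    # frac goes from 0 at start_alt_m to 1 at ground level
    frac = 1.0 - max(altitude_m, 0.0) / start_alt_m
    return start_pitch + frac * (final_pitch - start_pitch)
```

A linear schedule is only one choice; any monotone schedule that reaches 90 degrees as the drone arrives over the pad fits the scenario as described.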
Figure 4 is a flowchart of the autonomous landing method of a drone according to the first embodiment.
Referring to Figure 4, the autonomous landing method of the drone proceeds as follows.
First, the drone learns in advance to identify the landing pad through a preset deep learning algorithm.
Next, when the drone detects the landing pad and lands based on shooting information from the drone's camera in an area adjacent to the landing point, the landing proceeds in a diagonal direction while the center point of the landing pad's bounding box, detected from the shooting information of the camera set at an angle of 60 degrees from the ground, is compared with the center of the screen and kept there.
Next, when the drone reaches a predetermined altitude, the drone moves horizontally by applying the trigonometric function tanθ, considering the 60-degree angle from the ground and the altitude.
Next, the camera angle is changed to photograph the ground vertically, and the landing is completed while the center point of the landing pad's bounding box is compared with the center of the screen, based on the camera's shooting information, and kept there.
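The center-keeping comparison used throughout these steps (the bounding-box center versus the center of the screen) can be sketched as a proportional correction. This is an illustrative sketch rather than the patent's implementation; the gain and dead-band values are assumptions:

```python
def centering_command(bbox_center, frame_size, gain=0.005, deadband_px=10):
    """Map the pixel offset between the pad's bounding-box center and the
    screen center to lateral/longitudinal velocity commands.
    gain and deadband_px are illustrative tuning values, not from the patent."""
    cx, cy = bbox_center
    w, h = frame_size
    ex = cx - w / 2.0   # horizontal pixel error from screen center
    ey = cy - h / 2.0   # vertical pixel error from screen center
    vx = 0.0 if abs(ex) < deadband_px else -gain * ex  # sideways correction
    vy = 0.0 if abs(ey) < deadband_px else -gain * ey  # forward/back correction
    return vx, vy
```

Run in a loop with each detection, this drives the bounding box back toward the screen center while the descent continues, which is exactly the behavior the flowchart steps describe.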
Figure 5 is a flowchart of the autonomous landing method of a drone according to the second embodiment.
Referring to Figure 5, the autonomous landing method of the drone proceeds as follows.
First, the drone learns in advance to identify the landing pad through a preset deep learning algorithm.
Next, when the drone detects the landing pad and lands based on shooting information from the drone's camera in an area adjacent to the landing point, the landing proceeds in a diagonal direction while the center point of the landing pad's bounding box, detected from the shooting information of the camera set at an angle of 60 degrees from the ground, is compared with the center of the screen and kept there.
Next, when the drone reaches a predetermined altitude, it lands while gradually changing the camera angle so that it finally photographs the ground vertically; based on the camera's shooting information, the center point of the landing pad's bounding box is compared with the center of the screen and kept there until the landing is completed.
As such, a person skilled in the art to which the present invention pertains will understand that the present invention can be implemented in other specific forms without changing its technical idea or essential features. Therefore, the embodiments described above should be understood in all respects as illustrative and not restrictive. The scope of the present invention is indicated by the claims below rather than by the detailed description above, and all changes or modifications derived from the meaning and scope of the claims and their equivalents should be interpreted as falling within the scope of the present invention.

Claims (2)

  1. A step in which a drone learns in advance to identify a landing pad through a preset deep learning algorithm;
    when the drone detects the landing pad and proceeds to land based on shooting information from a camera of the drone in an area adjacent to a landing point, a step of landing in a diagonal direction while comparing a center point of a bounding box of the landing pad with the center of the screen, based on the shooting information of the camera set at a predetermined angle from the ground, and keeping it in the center of the screen;
    a step of, when the drone reaches a predetermined altitude, moving horizontally by applying the trigonometric function tanθ in consideration of the predetermined angle from the ground and the altitude; and
    a step of changing the angle of the camera to photograph the ground vertically and then completing the landing while comparing the center point of the bounding box of the landing pad with the center of the screen, based on the shooting information of the camera, and keeping it in the center of the screen,
    the above steps constituting an autonomous landing method for a drone.
  2. A step in which a drone learns in advance to identify a landing pad through a preset deep learning algorithm;
    when the drone detects the landing pad and proceeds to land based on shooting information from a camera of the drone in an area adjacent to a landing point, a step of landing in a diagonal direction while comparing a center point of a bounding box of the landing pad with the center of the screen, based on the shooting information of the camera set at a predetermined angle from the ground, and keeping it in the center of the screen; and
    a step of, when the drone reaches a predetermined altitude, landing while gradually changing the angle of the camera so that it finally photographs the ground vertically, completing the landing while comparing the center point of the bounding box of the landing pad with the center of the screen, based on the shooting information of the camera, and keeping it in the center of the screen,
    the above steps constituting an autonomous landing method for a drone.
PCT/KR2023/007725 2022-08-26 2023-06-07 Autonomous landing method for drone using object recognition and camera angle control WO2024043459A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020220107280A KR20240029158A (en) 2022-08-26 2022-08-26 Autonomous landing method of drone using object recognition and camera angle control
KR10-2022-0107280 2022-08-26

Publications (1)

Publication Number Publication Date
WO2024043459A1 true WO2024043459A1 (en) 2024-02-29

Family

ID=90013436

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/007725 WO2024043459A1 (en) 2022-08-26 2023-06-07 Autonomous landing method for drone using object recognition and camera angle control

Country Status (2)

Country Link
KR (1) KR20240029158A (en)
WO (1) WO2024043459A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100027747A (en) * 2008-09-03 2010-03-11 한국항공우주연구원 Automatic landing system and control method using circular image data for aircraft
KR102227740B1 (en) * 2019-07-17 2021-03-15 한국항공우주산업 주식회사 Landing Support System for Vertical Takeoff and Landing Type PAV
KR102307584B1 (en) * 2021-03-31 2021-09-30 세종대학교산학협력단 System for autonomous landing control of unmanned aerial vehicle
KR20220003851A (en) * 2020-07-02 2022-01-11 중앙대학교 산학협력단 Autonomous landing system of UAV on moving platform
KR20220100768A (en) * 2021-01-08 2022-07-18 아주대학교산학협력단 Apparatus and method for controlling landing based on image learning

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102448723B1 (en) 2020-11-23 2022-09-30 주식회사 해양드론기술 Autonomous Landing System


Also Published As

Publication number Publication date
KR20240029158A (en) 2024-03-05

Similar Documents

Publication Publication Date Title
CN111080679B (en) Method for dynamically tracking and positioning indoor personnel in large-scale place
WO2019225817A1 (en) Vehicle position estimation device, vehicle position estimation method, and computer-readable recording medium for storing computer program programmed to perform said method
CN103175524B (en) A kind of position of aircraft without view-based access control model under marking environment and attitude determination method
CN106153050A (en) A kind of indoor locating system based on beacon and method
CN112036210B (en) Method and device for detecting obstacle, storage medium and mobile robot
US20160238394A1 (en) Device for Estimating Position of Moving Body and Method for Estimating Position of Moving Body
US20190197908A1 (en) Methods and systems for improving the precision of autonomous landings by drone aircraft on landing targets
CN109885086A (en) A kind of unmanned plane vertical landing method based on the guidance of multiple polygonal shape mark
WO2020036295A1 (en) Apparatus and method for acquiring coordinate conversion information
CN110188749A (en) Designated vehicle Vehicle License Plate Recognition System and method under a kind of more vehicles
CN107783555B (en) Target positioning method, device and system based on unmanned aerial vehicle
CN106370160A (en) Robot indoor positioning system and method
WO2020159076A1 (en) Landmark location estimation apparatus and method, and computer-readable recording medium storing computer program programmed to perform method
JP6856855B2 (en) A method for correcting misalignment of a camera by selectively using information generated by itself and information generated by another individual, and a device using this.
CN106871906A (en) A kind of blind man navigation method, device and terminal device
CN104331884B (en) The stair climbing parameter acquiring system of four feeler caterpillar robots
CN111754551B (en) Target tracking method, device, system, equipment and storage medium
CN112836634B (en) Multi-sensor information fusion gate anti-trailing method, device, equipment and medium
CN105096327B (en) A kind of sidewalk for visually impaired people localization method based on computer binocular vision and homography matrix
KR20160102844A (en) System and method for guiding landing of multi-copter
WO2024043459A1 (en) Autonomous landing method for drone using object recognition and camera angle control
CN109801336A (en) Airborne target locating system and method based on visible light and infrared light vision
CN108460972A (en) A kind of more car plates based on unmanned plane independently detection and positioning system and method
WO2011078596A2 (en) Method, system, and computer-readable recording medium for adaptively performing image-matching according to conditions
CN114511592A (en) Personnel trajectory tracking method and system based on RGBD camera and BIM system