US20200208970A1 - Method and device for movable object distance detection, and aerial vehicle - Google Patents

Method and device for movable object distance detection, and aerial vehicle

Info

Publication number
US20200208970A1
US20200208970A1 (application US16/817,205)
Authority
US
United States
Prior art keywords
distance
aerial vehicle
distance value
target object
feature points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/817,205
Inventor
Zhenyu Zhu
Zihan Chen
Zhiyuan Zhang
Weifeng Liu
Chaobin Chen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co., Ltd.
Priority to US16/817,205
Assigned to SZ DJI TECHNOLOGY CO., LTD. (Assignors: CHEN, CHAOBIN; CHEN, ZIHAN; LIU, WEIFENG; ZHANG, ZHIYUAN; ZHU, ZHENYU)
Publication of US20200208970A1
Legal status: Abandoned

Classifications

    • G01C 11/08: Photogrammetry or videogrammetry; interpretation of pictures by comparison of two or more pictures of the same area, the pictures not being supported in the same relative position as when they were taken
    • G01C 3/08: Measuring distances in line of sight; optical rangefinders using electric radiation detectors
    • G01C 3/14: Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length, with binocular observation at a single point, e.g. stereoscopic type
    • G01C 9/005: Measuring inclination, e.g. by clinometers or levels, specially adapted for use in aircraft
    • G01C 21/18: Inertial navigation; stabilised platforms, e.g. by gyroscope
    • B64C 39/024: Aircraft characterised by special use, of the remote controlled vehicle type, i.e. RPV
    • B64U 2201/00: UAVs characterised by their flight controls
    • G05D 1/0094: Control of position, course, altitude or attitude of land, water, air or space vehicles, involving pointing a payload, e.g. camera, weapon or sensor, towards a fixed or moving target
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G05D 1/101: Simultaneous control of position or course in three dimensions, specially adapted for aircraft
    • G06K 9/0063; G06K 9/209; G06K 9/4671; G06K 9/6202; G06K 9/78; G06K 2209/21 (legacy image-recognition codes)
    • G06T 7/248: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving reference images or patches
    • G06T 7/292: Analysis of motion; multi-camera tracking
    • G06T 7/593: Depth or shape recovery from multiple images; from stereo images
    • G06T 7/60: Analysis of geometric attributes
    • G06T 2207/10004: Image acquisition modality; still image, photographic image
    • G06T 2207/10012: Image acquisition modality; stereo images
    • G06T 2207/10032: Image acquisition modality; satellite or aerial image, remote sensing
    • G06T 2207/30261: Subject of image; vehicle exterior or vicinity of vehicle; obstacle
    • G06V 10/462: Descriptors for shape, contour or point-related descriptors; salient features, e.g. scale invariant feature transforms [SIFT]
    • G06V 20/13: Terrestrial scenes; satellite images
    • G06V 20/17: Terrestrial scenes taken from planes or by drones
    • G06V 2201/07: Target detection
    • G08G 5/0008: Transmission of traffic-related information to or from an aircraft, with other aircraft
    • G08G 5/0013: Transmission of traffic-related information to or from an aircraft, with a ground station
    • G08G 5/0021: Arrangements for implementing traffic-related aircraft activities, located in the aircraft
    • G08G 5/0026: Arrangements for implementing traffic-related aircraft activities, located on the ground
    • G08G 5/0069: Navigation or guidance aids specially adapted for an unmanned aircraft
    • G08G 5/0078: Surveillance aids for monitoring traffic from the aircraft
    • G08G 5/0082: Surveillance aids for monitoring traffic from a ground station
    • G08G 5/04: Anti-collision systems
    • G08G 5/045: Anti-collision systems; navigation or guidance aids, e.g. determination of anti-collision manoeuvres

Definitions

  • the present disclosure relates to the technical field of electronics, and more particularly to a method and device for movable object distance detection, and an aerial vehicle.
  • remote-controlled intelligent movable apparatuses, such as remote-controlled robots, remote-controlled cars, and unmanned aerial vehicles, can perform reconnaissance and surveillance tasks for a wide variety of military and civilian applications.
  • a distance from the movable apparatus to an object can be detected during a movement of the movable apparatus, to detect or track the object in real time.
  • however, the distance detected by prior methods is not accurate, leading to an inaccurate determination by the movable apparatus and to inaccurate detection or tracking.
  • Embodiments of the present disclosure provide a method and device for movable object distance detection, and an aerial vehicle which enable a more accurate distance detection.
  • An aspect of the disclosure provides a method for movable object distance detection, the method comprising detecting a first distance value between the movable object and a target object, the target object being positioned in a target direction of the movable object; obtaining an inclination angle of the movable object at the target direction, and calculating a second distance value from the movable object to the target object based upon the inclination angle and the first distance value; and determining the second distance value as an actual distance between the movable object and the target object.
  • the method can further comprise controlling a movement of the movable object based upon the second distance value.
  • detecting a first distance value between the movable object and the target object can comprise obtaining the first distance value between the movable object and the target object based upon distance sensing data of a distance sensing.
  • detecting a first distance value between the movable object and the target object can comprise obtaining the first distance value between the movable object and the target object based upon visual sensing data of a visual sensing.
  • obtaining the first distance value between the movable object and the target object based upon visual sensing data of a visual sensing can comprise obtaining at least two images in the target direction; determining feature points in each of the at least two images, comparing feature points and determining correlated feature points between the at least two images; and calculating the first distance value between the movable object and the target object based upon the correlated feature points.
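As an illustration of the last step above, a distance can be recovered from one pair of correlated feature points in a rectified stereo image pair via standard triangulation. This is a minimal sketch; the function name, focal length, baseline, and pixel coordinates are illustrative assumptions rather than values from the disclosure:

```python
def stereo_distance(x_left, x_right, focal_px, baseline_m):
    """Depth of a correlated feature point from its horizontal pixel
    coordinates in the left and right images of a rectified stereo pair."""
    disparity = x_left - x_right  # pixels; larger disparity means closer
    if disparity <= 0:
        raise ValueError("correlated point must have positive disparity")
    # Similar triangles: depth = focal length (px) * baseline (m) / disparity (px)
    return focal_px * baseline_m / disparity

# Hypothetical values: 700 px focal length, 0.10 m baseline, 10 px disparity
print(stereo_distance(350.0, 340.0, focal_px=700.0, baseline_m=0.10))
```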
  • determining feature points in each of the at least two images can comprise obtaining an inclination angle of the movable object at the target direction; and determining an effective area in each of the at least two images based upon the inclination angle, and determining a feature point in an effective area of each of the at least two images.
  • the correlated feature points can be a group of sparse feature points.
  • the calculating the first distance value between the movable object and the target object based upon the correlated feature points can comprise calculating the first distance value between the movable object and the target object based upon the sparse feature points.
  • the correlated feature points can be a group of dense feature points.
  • the calculating the first distance value between the movable object and the target object based upon the correlated feature points can comprise calculating the first distance value between the movable object and the target object based upon the dense feature points.
  • the first distance value can be an average value of a plurality of distance values between the movable object and the target object.
  • the detecting a first distance value between the movable object and a target object can comprise: obtaining distance sensing data sensed by a distance sensing process, and calculating a distance value from the distance sensing data; if the calculated distance value is less than a preset distance threshold, determining the calculated distance value as the first distance value; if the calculated distance value is larger than the preset distance threshold, determining a distance value between the movable object and the target object sensed by a visual sensing process as the first distance value.
  • controlling a movement of the movable object based upon the actual distance can comprise performing an obstacle avoidance based upon the actual distance.
  • the obstacle avoidance can comprise limiting a moving speed or avoiding the target object.
  • the controlling a movement of the movable object based upon the actual distance can comprise performing a target tracking based upon the actual distance.
  • the target tracking can comprise controlling a movement of the movable object such that a distance between the movable object and the target object can be within a predetermined tracking distance threshold.
  • Another aspect of the disclosure further provides a device for movable object distance detection, the device comprising: a detection module configured to detect a first distance value between the movable object and a target object, the target object being positioned in a target direction of the movable object; and a processing module configured to obtain an inclination angle of the movable object at the target direction, calculate a second distance value from the movable object to the target object based upon the inclination angle and the first distance value, and determine the second distance value as an actual distance between the movable object and the target object.
  • the device can further comprise a control module configured to control a movement of the movable object based upon the actual distance.
  • the detection module can be configured to obtain the first distance value between the movable object and the target object based upon distance sensing data of a distance sensing.
  • the detection module can be configured to obtain the first distance value between the movable object and the target object based upon visual sensing data of a visual sensing.
  • the detection module can comprise: a capturing unit configured to obtain at least two images in the target direction; a comparing unit configured to determine feature points in each of the at least two images and compare feature points and determine correlated feature points between the at least two images; and a determining unit configured to calculate the first distance value between the movable object and the target object based upon the correlated feature points.
  • the comparing unit can be configured to obtain an inclination angle of the movable object at the target direction, determine an effective area in each of the at least two images based upon the inclination angle, and determine a feature point in an effective area of each of the at least two images.
  • the determined feature points can be a group of sparse feature points.
  • the determining unit can be configured to calculate the first distance value between the movable object and the target object based upon the sparse feature points.
  • the determined feature points can be a group of dense feature points.
  • the determining unit can be configured to calculate the first distance value between the movable object and the target object based upon the dense feature points.
  • the first distance value can be an average value of a plurality of distance values between the movable object and the target object.
  • the detection module can comprise: a first obtaining unit configured to obtain distance sensing data sensed by a distance sensing process and calculate a distance value from the distance sensing data; a distance value determining unit configured to, if the calculated distance value is less than a preset distance threshold, set the calculated distance value as a first distance value; and a second obtaining unit configured to, if the calculated distance value is larger than the preset distance threshold, set a distance value between the movable object and the target object sensed by a visual sensing process as the first distance value.
  • the control module can be configured to perform an obstacle avoidance based upon the actual distance.
  • the obstacle avoidance can comprise limiting a moving speed or avoiding the target object.
  • the control module can be configured to perform a target tracking based upon the actual distance.
  • the target tracking operation can comprise controlling a movement of the movable object such that a distance between the movable object and the target object can be within a predetermined tracking distance threshold.
  • Yet another aspect of the disclosure provides an aerial vehicle, the aerial vehicle comprising a propulsion device, a flight controller and a sensor.
  • the sensor can be configured to sense distance data from the aerial vehicle to a target object.
  • the flight controller can be configured to determine a first distance value from the aerial vehicle to a target object based upon the distance data of the sensor, the target object being positioned in a target direction of the aerial vehicle, obtain an inclination angle of the aerial vehicle at the target direction, and calculate a second distance value from the aerial vehicle to the target object based upon the inclination angle and the first distance value.
  • the flight controller can be further configured to provide a control instruction to the propulsion device to control the propulsion device to effect a movement of the aerial vehicle based upon the second distance value.
  • A still further aspect of the disclosure provides an aerial vehicle, the aerial vehicle comprising a propulsion device, a flight controller and a sensor.
  • the sensor can be configured to detect a first distance value from the aerial vehicle to a target object, the target object being positioned in a target direction of the aerial vehicle.
  • the flight controller can be configured to obtain an inclination angle of the aerial vehicle at the target direction, and calculate a second distance value from the aerial vehicle to the target object based upon the inclination angle and the first distance value.
  • the flight controller can be further configured to provide a control instruction to the propulsion device to control the propulsion device to effect a movement of the aerial vehicle based upon the second distance value.
  • Embodiments of the disclosure can correct a directly measured distance value to obtain a more accurate distance value, thereby enabling an accurate movement control of the movable object.
  • FIG. 1 shows a flowchart of a method for movable object distance detection in accordance with an embodiment of the disclosure.
  • FIG. 2 shows a schematic view of determining a distance between a movable object and a target object in accordance with an embodiment of the disclosure.
  • FIG. 3 shows a flowchart of a method for movable object distance detection in accordance with another embodiment of the disclosure.
  • FIG. 4 shows a flowchart of a method of measuring a distance value in accordance with an embodiment of the disclosure.
  • FIG. 5 shows a structure of a sensor assembly sensing a distance in accordance with an embodiment of the disclosure.
  • FIG. 6 shows a distance measurement using a binocular camera in accordance with an embodiment of the disclosure.
  • FIG. 7 shows a calculation in distance measurement using a binocular camera in accordance with an embodiment of the disclosure.
  • FIG. 8 shows a structure of a device for detecting distance for a movable object in accordance with an embodiment of the disclosure.
  • FIG. 9 shows a structure of a detection module in accordance with an embodiment of the disclosure.
  • FIG. 10 shows a structure of a detection module in accordance with another embodiment of the disclosure.
  • FIG. 11 shows a structure of an aerial vehicle in accordance with an embodiment of the disclosure.
  • FIG. 12 shows a structure of an aerial vehicle in accordance with another embodiment of the disclosure.
  • a directly detected distance can be corrected with an inclination angle of the movable object. Therefore, a more accurate distance value can be obtained, enabling an accurate movement control including an obstacle avoidance and a tracking based on the distance value.
  • FIG. 1 shows a flowchart of a method for movable object distance detection in accordance with an embodiment of the disclosure.
  • the method can be performed by a processor.
  • the method can be performed by a movement controller onboard a movable object, such as a flight controller onboard an unmanned aerial vehicle.
  • the method can comprise steps S 101 to S 103 .
  • step S 101 a process of detecting a first distance value between the movable object and a target object can be performed.
  • the target object can be positioned in a target direction of the movable object.
  • the target direction can be dead ahead of the movable object in its movement.
  • the target direction can be a direction A (e.g., a flight direction) as shown in FIG. 2 .
  • the movable object can be provided with a monocular or binocular visual sensor, a distance sensor such as an ultrasonic sensor, an infrared sensor, a radar sensor and/or a laser sensor, or another sensor capable of sensing a distance.
  • the first distance value D 1 can be calculated from sensed data provided by the visual sensor and/or the distance sensor as discussed hereinabove.
  • the visual sensor or distance sensor can be configured to sense and calculate distance values over an area, such that distances from the movable object to a plurality of feature points on a given plane of the target object can be obtained. The first distance value D 1 can therefore be a representative value. In some instances, the first distance value D 1 can be an average of the distance values from the movable object to the plurality of points. Optionally, the first distance value D 1 can be the distance value from the movable object to the feature point, among the plurality, having the most prominent feature, such that the distance value can be the most accurate one.
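The averaging option mentioned above can be sketched as follows (a minimal sketch; the function name is hypothetical):

```python
def first_distance_value(point_distances):
    """Combine the distances from the movable object to a plurality of
    feature points into a single first distance value D1 by averaging."""
    if not point_distances:
        raise ValueError("no feature-point distances available")
    return sum(point_distances) / len(point_distances)

# Hypothetical per-point distances in meters
print(first_distance_value([4.9, 5.0, 5.1]))
```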
  • the first distance value D 1 between the movable object and the target object can be detected and obtained based upon distance sensing data of a distance sensing.
  • the first distance value D 1 between the movable object and the target object can be detected and obtained based upon visual sensing data of a visual sensing.
  • step S 102 a process of obtaining an inclination angle of the movable object at the target direction and calculating a second distance value from the movable object to the target object based upon the obtained inclination angle and the first distance value can be performed.
  • the inclination angle a can be measured using a sensor onboard the movable object, such as a gyroscope or an accelerometer.
  • the inclination angle a can be an intersection angle between a moving direction of the movable object and a horizontal plane.
  • the inclination angle a can be an intersection angle between the target direction and the horizontal plane.
  • an aerial vehicle can be inclined as shown in FIG. 2 when the aerial vehicle is travelling forward.
  • the inclination and rotation can be sensed by a sensor onboard the aerial vehicle to obtain the inclination angle a.
  • a computation can be performed on the first distance value D 1 and the inclination angle using trigonometric functions, such that the second distance value D 2 can be obtained by correcting the first distance value D 1 .
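A minimal sketch of this correction, using the relation D 2 = cos(a)*D 1 that the disclosure gives for the trigonometric computation (the function name and sample values are illustrative):

```python
import math

def corrected_distance(d1, inclination_deg):
    """Correct the directly measured distance D1 with the inclination
    angle a of the movable object: D2 = cos(a) * D1."""
    return math.cos(math.radians(inclination_deg)) * d1

# Hypothetical reading: D1 = 10 m measured while inclined 60 degrees
print(corrected_distance(10.0, 60.0))
```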
  • step S 103 a process of determining the second distance value as an actual distance between the movable object and the target object can be performed.
  • the actual distance value can be directly provided to a connected controller, such that the controller can direct a movement of the movable object based upon the actual distance value.
  • the actual distance can be provided to a user, such that the user can view the actual distance.
  • a movement of the movable object can be controlled based upon the actual distance.
  • the movable object can be controlled to effect a manipulation including an obstacle avoidance and a tracking based upon the actual distance D 2 .
  • a movement of the movable object can be stopped to avoid crashing into the target object.
  • a new flight route can be planned to bypass the target object.
  • a further movement of the movable object can be necessary to keep the target object within the distance threshold to enable surveillance.
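The control behaviors above (stopping or bypassing an obstacle, and moving so a tracked target stays in range) can be sketched as one coarse decision rule; the command names and thresholds are illustrative assumptions, not part of the disclosure:

```python
def movement_command(actual_distance, avoid_threshold, track_threshold):
    """Map the corrected actual distance D2 to a coarse movement command."""
    if actual_distance < avoid_threshold:
        return "avoid"     # stop, or plan a new route to bypass the object
    if actual_distance > track_threshold:
        return "approach"  # move so the target stays within tracking range
    return "hold"          # distance is acceptable; keep current behavior

print(movement_command(2.0, avoid_threshold=3.0, track_threshold=10.0))   # -> avoid
print(movement_command(12.0, avoid_threshold=3.0, track_threshold=10.0))  # -> approach
```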
  • the visual sensor or distance sensor can measure a distance from the sensor itself to the target object. Therefore, a predetermined value (e.g., a value D) can be added to the distance value calculated from the sensed data.
  • alternatively, the distance value calculated from the sensed data can be used directly as the first distance value D 1 between the movable object and the target object, and the distance threshold can be configured in view of the fixed value to effect a movement control.
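The fixed-offset variant described above is simple arithmetic; a minimal sketch (the function name and values are illustrative):

```python
def object_to_target_distance(sensor_distance, mount_offset):
    """Convert a sensor-to-target distance into a movable-object-to-target
    distance by adding the fixed value D for the sensor's mounting offset."""
    return sensor_distance + mount_offset

# Hypothetical: sensor reads 4.5 m, sensor sits 0.5 m from the reference point
print(object_to_target_distance(4.5, 0.5))  # -> 5.0
```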
  • a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • FIG. 3 shows a flowchart of a method for movable object distance detection in accordance with another embodiment of the disclosure.
  • the method can be performed by a processor.
  • the method can be performed by a movement controller onboard a movable object, such as a flight controller onboard an unmanned aerial vehicle.
  • the method can comprise steps S 301 to S 306 .
  • step S 301 a process of obtaining distance sensing data by a distance sensing process and processing the distance sensing data to calculate a distance value can be performed.
  • the distance sensing process can comprise effecting a distance sensing using a sensed data of a distance sensor.
  • the distance sensor can comprise an ultrasonic sensor, an infrared sensor, a radar sensor and/or a laser sensor.
  • step S 302 a process of determining whether the calculated distance value is less than a preset distance threshold can be performed.
  • the distance threshold can be set based upon an effective detection range of a distance sensor such as an ultrasonic sensor.
  • the effective detection range of a typical ultrasonic sensor can be about 3 meters.
  • the preset distance threshold can be set as 3 meters if an ultrasonic sensor is used as the distance sensor for sensing a distance.
  • a distance value sensed by the ultrasonic sensor can be set as a first distance value if the distance value sensed by the ultrasonic sensor is less than 3 meters. That is, a step S 303 can be performed. Otherwise, a step S 304 in which a visual sensor is used can be performed if the distance value sensed by the ultrasonic sensor is not less than 3 meters.
  • step S 303 a process of determining the distance value sensed by the ultrasonic sensor as a first distance value can be performed if the calculated distance value is less than the preset distance threshold.
  • step S 304 a process of obtaining a distance value between the movable object and the target object by a visual sensing process and determining the distance value as the first distance value can be performed if the calculated distance value is larger than the preset distance threshold.
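The threshold-based selection in steps S301 to S304 can be sketched as below. The function and constant names are hypothetical; the logic simply prefers the ultrasonic reading inside its effective range and falls back to the visual sensing result otherwise:

```python
ULTRASONIC_RANGE_M = 3.0  # preset distance threshold (effective ultrasonic range)

def first_distance(ultrasonic_m, visual_m):
    """Select the first distance value D1 (steps S301-S304): use the
    ultrasonic reading when it falls inside the sensor's effective range,
    otherwise fall back to the visually sensed distance."""
    if ultrasonic_m is not None and ultrasonic_m < ULTRASONIC_RANGE_M:
        return ultrasonic_m  # step S303: ultrasonic reading is trustworthy
    return visual_m          # step S304: use the visual sensing result
```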
  • step S 305 a process of obtaining an inclination angle of the movable object at a target direction and calculating a second distance value from the movable object to the target object based upon the obtained inclination angle and the first distance value can be performed.
  • the inclination angle can be measured using a sensor onboard the movable object, such as a gyroscope or an accelerometer.
  • the inclination angle can be an intersection angle between a moving direction of the movable object and a horizontal plane. In other words, the inclination angle can be an intersection angle between the target direction and the horizontal plane.
  • an aerial vehicle can be inclined as shown in FIG. 2 when the aerial vehicle is travelling forward. The inclination and rotation can be sensed by a sensor onboard the aerial vehicle to obtain the inclination angle.
  • a computation can be performed on the first distance value and the inclination angle using trigonometric functions, such that the second distance value can be obtained by correcting the first distance value.
  • a formula D2=cos(a)*D1 can be used, where D2 is the second distance value, D1 is the first distance value, and a is the inclination angle.
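The cosine correction can be expressed as a one-line helper. The function name is hypothetical, and the inclination angle is assumed to be given in radians:

```python
import math

def corrected_distance(d1, inclination_rad):
    """Correct the measured first distance value D1 for the vehicle's
    inclination at the target direction: D2 = cos(a) * D1."""
    return math.cos(inclination_rad) * d1
```

For example, with a 60-degree inclination (pi/3 radians) a measured 10 m shortens to a 5 m horizontal distance.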
  • step S 306 a process of determining the second distance value as an actual distance between the movable object and the target object can be performed.
  • a movement of the movable object can be controlled based upon the actual distance.
  • Controlling the movement of the movable object based upon the actual distance can comprise performing an obstacle avoidance based upon the actual distance.
  • the obstacle avoidance can comprise limiting a moving speed or avoiding a target object.
  • a flight speed of the movable object can be slowed down until reaching a predefined minimum speed as the movable object gets closer to the target object.
  • a current moving speed of the movable object and a distance required to stop the movable object from the current moving speed, also referred to as a "stopping distance", can be obtained.
  • if the movable object detects a target object in the front and a distance to the target object is shorter than a distance threshold that is a sum of the stopping distance and a safe distance, the movable object can enter into an emergency braking state to avoid crashing into the obstacle, e.g., the target object.
  • the aircraft, e.g., the movable object, can automatically retreat to a position outside the safe distance.
  • a speed of the retreating can be faster if the distance to the detected target object is smaller.
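The braking and retreat behaviour described above can be sketched as below. The constant-deceleration stopping distance v²/(2a) and the linear retreat-speed ramp are illustrative assumptions, and all names are hypothetical:

```python
def should_brake(distance_m, speed_mps, decel_mps2, safe_m):
    """Enter the emergency-braking state when the obstacle is closer than
    stopping distance + safe distance; stopping distance is taken as
    v^2 / (2*a) for a constant deceleration a (an assumption)."""
    stopping_m = speed_mps ** 2 / (2.0 * decel_mps2)
    return distance_m < stopping_m + safe_m

def retreat_speed(distance_m, safe_m, max_retreat_mps):
    """Retreat faster the closer the detected target is: a simple linear
    ramp from zero (at the safe distance) up to a maximum speed."""
    if distance_m >= safe_m:
        return 0.0
    return max_retreat_mps * (1.0 - distance_m / safe_m)
```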
  • Controlling the movement of the movable object based upon the actual distance can further comprise tracking a target based upon the actual distance.
  • tracking a target can comprise controlling a movement of the movable object such that a distance between the movable object and the target object is within a predetermined tracking distance threshold.
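A minimal sketch of such a tracking rule, with hypothetical names, keeps the actual distance within the predetermined tracking distance threshold:

```python
def tracking_command(actual_m, tracking_m, tolerance_m):
    """Hypothetical tracking rule: advance when the target drifts beyond
    the tracking distance threshold, retreat when it comes too close,
    otherwise hold the current position."""
    if actual_m > tracking_m + tolerance_m:
        return "advance"
    if actual_m < tracking_m - tolerance_m:
        return "retreat"
    return "hold"
```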
  • FIG. 4 shows a flowchart of a method of measuring a distance value in accordance with an embodiment of the disclosure.
  • the first distance value as discussed hereinabove can be calculated using a binocular visual sensor.
  • the method can comprise steps S 401 to S 403 .
  • step S 401 a process of obtaining at least two images in a target direction can be performed.
  • FIG. 5 shows a configuration of a sensor assembly capable of sensing a distance which can comprise two visual sensors (e.g., a camera 501 and a camera 502 ).
  • FIG. 5 shows distance sensors such as an ultrasonic probe 503 and an ultrasonic probe 504 .
  • the sensor assembly can be provided on each surface of the movable object, such that a distance value from the movable object to the target object in every possible moving direction of the movable object can be measured.
  • a pixel can be identified in an image captured by camera 502 corresponding to a pixel in an image captured by camera 501 , and vice versa.
  • the first distance value can be calculated from two corresponding pixels, coordinates of the pixels in the respective images, a focal length of each camera and a distance between the two cameras.
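This is the standard pinhole stereo triangulation: for corresponding pixels with horizontal coordinates x_left and x_right, depth is Z = f·B/(x_left − x_right), where f is the focal length in pixels and B the baseline between the two cameras. A sketch with a hypothetical function name:

```python
def stereo_depth(x_left_px, x_right_px, focal_px, baseline_m):
    """Depth of a point from the horizontal coordinates of its two
    corresponding pixels: Z = f * B / disparity (pinhole stereo model)."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("corresponding point must have positive disparity")
    return focal_px * baseline_m / disparity
```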
  • step S 402 a process of determining feature points in each of the at least two images, comparing the feature points and determining correlated feature points between the at least two images can be performed.
  • Pixels having prominent characteristics can be extracted from images captured by the binocular visual sensor.
  • a process of extracting a feature point can be implemented using an algorithm such as FAST (features from accelerated segment test), SIFT (scale-invariant feature transform) or SURF (speeded up robust features). For instance, a feature point extracted by the FAST algorithm can be described using a BRIEF (binary robust independent elementary features) descriptor.
  • Correlated feature points between two images can be determined by a binocular feature point matching.
  • correlated feature points can be determined by comparing a degree of similarity of corresponding pixels between the two images using the BRIEF descriptors of the feature points, the feature points being extracted using the FAST algorithm.
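The descriptor comparison can be sketched as a Hamming-distance match over binary (BRIEF-style) descriptors. This is a simplified illustration rather than the exact matching procedure; names and the distance threshold are assumptions:

```python
def hamming(a, b):
    """Hamming distance between two binary (BRIEF-style) descriptors
    packed as integers."""
    return bin(a ^ b).count("1")

def match_features(desc_left, desc_right, max_dist=40):
    """Greedy nearest-neighbour matching between the descriptor lists of
    the two images; returns (i, j) index pairs of correlated feature points."""
    matches = []
    for i, d in enumerate(desc_left):
        j, best = min(enumerate(hamming(d, e) for e in desc_right),
                      key=lambda t: t[1])
        if best <= max_dist:  # accept only sufficiently similar descriptors
            matches.append((i, j))
    return matches
```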
  • the correlated feature points can comprise pixels of a measured point of the target object in two binocular visual images.
  • step S 403 a process of calculating a first distance value from the movable object to the target object based upon the determined correlated feature points can be performed.
  • a distance value to each feature point can be calculated based upon triangulation, and the first distance value can then be obtained.
  • FIG. 6 and FIG. 7 show a theory of binocular visual distance measurement.
  • the first distance value can be obtained by averaging the distance values obtained from a plurality of measured points.
  • areas in the two images in which the correlated feature points are positioned can be specified to reduce a computation in determining the correlated feature points.
  • the process of determining feature points in the at least two images can comprise obtaining an inclination angle of the movable object at a target direction, determining an effective area in each of the at least two images based upon the obtained angle and determining a feature point in the effective area of each image.
  • the inclination angle of the movable object can be obtained using an IMU (inertial measurement unit).
  • a camera can point downward with a movement of the movable object, thus an image captured by the camera can comprise an area including a dead ahead area of the movable object and an area below the dead ahead area of the movable object.
  • the dead ahead area of the movable object can be positioned at an upper portion of the image captured by the camera.
  • a proportion of the dead ahead area of the movable object in the image can be estimated, such that the effective area can be determined for further calculation.
  • if the camera points upward with the movable object, then the dead ahead area of the movable object can be positioned at a lower portion of the image captured by the camera.
  • the effective area can be similarly determined based upon the inclination angle measured by the IMU.
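One simple way to pick such an effective area is sketched below, assuming the excluded region occupies a fraction of the frame proportional to the inclination angle relative to the camera's vertical field of view. This is a rough illustrative model with hypothetical names, not the exact method:

```python
def effective_rows(image_height, inclination_rad, vfov_rad):
    """Return the (top, bottom) row range of the effective area, given the
    IMU-measured inclination angle and the camera's vertical field of view.
    A positive angle (camera tilted down) excludes the upper portion of the
    image; a negative angle excludes the lower portion."""
    # fraction of the frame to exclude, clamped to [0, 1] (an assumption)
    frac = max(0.0, min(1.0, abs(inclination_rad) / vfov_rad))
    if inclination_rad >= 0:
        return int(frac * image_height), image_height
    return 0, int((1.0 - frac) * image_height)
```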
  • a group of relatively sparse feature points can be obtained using the FAST algorithm.
  • the first distance value from the movable object to the target object can be calculated based upon the determined sparse feature points.
  • a group of dense feature points can be obtained by a block matching in combination with a speckle filter.
  • the first distance value from the movable object to the target object can be calculated based upon the determined dense feature points.
  • a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • a detailed description will be provided to a device for detecting distance for a movable object and an aerial vehicle in accordance with embodiments of the disclosure.
  • FIG. 8 shows a structure of a device for detecting distance for a movable object in accordance with an embodiment of the disclosure.
  • the device can be provided in a processor.
  • the device can be provided in a movement controller onboard a movable object, such as a flight controller of an aerial vehicle.
  • the device can comprise detection module 1 and a processing module 2 .
  • the detection module 1 can be configured to detect a first distance value between the movable object and a target object.
  • the target object can be positioned in a target direction of the movable object.
  • the processing module 2 can be configured to obtain an inclination angle of the movable object at the target direction, calculate a second distance value from the movable object to the target object based upon the obtained inclination angle and the first distance value, and determine the second distance value as an actual distance between the movable object and the target object.
  • the device can further comprise a control module 3 configured to control a movement of the movable object based upon the actual distance.
  • the target direction can be dead ahead in a movement direction of the movable object.
  • the movable object can be provided with a monocular or binocular visual sensor, a distance sensor such as an ultrasonic sensor, an infrared sensor, a radar sensor and/or a laser sensor, and other sensors capable of sensing a distance.
  • the detection module 1 can be configured to calculate the first distance value from a sensed data sensed by the visual sensor and/or the distance sensor.
  • the detection module 1 can obtain distances from the movable object to a plurality of feature points on a given plane of the target object. Therefore, the first distance value can be derived from those distances in several ways. In some instances, the first distance value can be an average of a plurality of distance values from the movable object to more than one of the plurality of feature points. Optionally, the first distance value can be a distance value from the movable object to a feature point, among the plurality of feature points, having a most prominent feature, such that the distance value can be the most accurate distance value.
  • the detection module 1 can be configured to detect and obtain the first distance value between the movable object and the target object based upon distance sensing data of a distance sensing.
  • the detection module 1 can be configured to detect and obtain the first distance value between the movable object and the target object based upon visual sensing data of a visual sensing.
  • the processing module 2 can be configured to obtain an inclination angle from a sensed data of a sensor onboard the movable object, such as a gyroscope or an accelerometer.
  • the inclination angle a can be an intersection angle between a moving direction of the movable object and a horizontal plane. In other words, the inclination angle a can be an intersection angle between the target direction and the horizontal plane.
  • an aerial vehicle can be inclined when the aerial vehicle is travelling forward.
  • the processing module 2 can be configured to sense the inclination and rotation using a sensor onboard the aerial vehicle to obtain the inclination angle.
  • a computation to the first distance value and the inclination angle using trigonometric functions can be performed by the processing module 2 , such that the second distance value can be obtained by correcting the first distance value.
  • the second distance value can be the actual distance between the movable object and the target object.
  • the processing module 2 can be configured to directly provide the actual distance to other controllers or a user end.
  • the processing module 2 can be configured to directly control the movable object, through the control module 3, to effect a manipulation including an obstacle avoidance, tracking and monitoring based upon the actual distance. For instance, in case the actual distance is less than a distance threshold, a movement of the movable object can be stopped to avoid crashing into the target object.
  • a new flight route can be planned to bypass the target object. For instance, in case the actual distance is larger than a distance threshold, a further movement of the movable object can be necessary to keep the target object within the distance threshold to enable a surveillance.
  • the visual sensor or distance sensor can measure a distance from the sensor to the target object. Therefore, a predetermined offset can be added to the distance value which is calculated from the sensed data.
  • the distance value calculated from the sensed data can be used directly as the first distance value between the movable object and the target object, and a distance threshold can be configured in view of a fixed value to effect a movement control.
  • a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • Embodiments of the disclosure further provide a structure of another device for detecting distance for a movable object.
  • the device can be provided in a processor.
  • the device can be disposed in a movement controller onboard a movable object, such as a flight controller of an aerial vehicle.
  • the device can comprise the detection module 1 , processing module 2 and control module 3 as discussed hereinabove.
  • the detection module 1 can be configured to detect and obtain a first distance value between the movable object and the target object based upon distance sensing data of a distance sensing.
  • the distance sensor can comprise an ultrasonic distance sensor, a radar distance sensor or an infrared distance sensor.
  • the detection module 1 can be configured to detect and obtain a first distance value between the movable object and the target object based upon visual sensing data of a visual sensing.
  • the visual sensing can comprise a visual sensing system having two cameras.
  • FIG. 9 shows the detection module 1 configured to sense a distance using a visual sensing.
  • the detection module 1 can comprise a capturing unit 11 , a comparing unit 12 and a determining unit 13 .
  • the capturing unit 11 can be configured to capture at least two images in a target direction.
  • the comparing unit 12 can be configured to determine feature points in the at least two images, compare feature points and determine correlated feature points between the at least two images.
  • the determining unit 13 can be configured to calculate a first distance value from the movable object to the target object based upon the determined correlated feature points.
  • the comparing unit 12 can be configured to obtain an inclination angle of the movable object at a target direction, determine an effective area on each of the at least two images based upon the obtained angle, and determine a feature point in the effective area of each image.
  • the first distance value can be an average of a plurality of distance values between the movable object and the target object.
  • the detection module 1 can be configured to detect the first distance value by a distance sensing process and a visual sensing process.
  • the detection module 1 can be configured to determine which distance sensing process is to be used based upon a distance detected by a distance sensor and/or a distance detected by a visual sensor.
  • FIG. 10 shows the detection module 1 which can comprise a first obtaining unit 14, a distance value determining unit 15 and a second obtaining unit 16.
  • the first obtaining unit 14 can be configured to obtain distance sensing data sensed by a distance sensing process and calculate a distance value from the distance sensing data.
  • the distance value determining unit 15 can be configured to, if the calculated distance value is less than a preset distance threshold, set the distance value as a first distance value.
  • the second obtaining unit 16 can be configured to, if the calculated distance value is larger than the preset distance threshold, obtain a first distance value between the movable object and the target object by a visual sensing process.
  • control module 3 can be configured to perform an obstacle avoidance based upon the actual distance.
  • the obstacle avoidance can comprise limiting a moving speed or avoiding a target object.
  • control module 3 can be configured to perform a target tracking based upon the actual distance.
  • the target tracking can comprise controlling a movement of the movable object, such that a distance between the movable object and the target object is within a predetermined tracking distance threshold.
  • a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • FIG. 11 shows a structure of an aerial vehicle in accordance with embodiments of the disclosure.
  • the aerial vehicle can comprise a propulsion device 100 , a flight controller 200 and a sensor 300 in addition to various prior art vehicle frames and power supplies.
  • the sensor 300 can be configured to sense distance data from the aerial vehicle to a target object.
  • the flight controller 200 can be configured to determine a first distance value from the aerial vehicle to the target object based upon the sensed distance data of the sensor 300 , the target object being positioned in a target direction of the aerial vehicle, obtain an inclination angle of the aerial vehicle at the target direction, and calculate a second distance value from the aerial vehicle to the target object based upon the obtained inclination angle and the first distance value.
  • the flight controller 200 can be configured to provide a control instruction to the propulsion device 100 to control the propulsion device to effect a movement of the aerial vehicle based upon the second distance value.
  • flight controller 200 and the sensor 300 as discussed in embodiments of the disclosure can be implemented with reference to corresponding methods or functional modules as described in the disclosure shown in FIG. 1 to FIG. 10 .
  • a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • FIG. 12 shows a structure of aerial vehicle in accordance with another embodiment of the disclosure.
  • the aerial vehicle can comprise a propulsion device 400 , a flight controller 500 and a sensor 600 in addition to various prior art vehicle frames and power supplies.
  • the sensor 600 can be configured to detect a first distance value from the aerial vehicle to a target object.
  • the target object can be positioned in a target direction of the movable object.
  • the flight controller 500 can be configured to obtain an inclination angle of the aerial vehicle at the target direction and calculate a second distance value from the aerial vehicle to the target object based upon the obtained inclination angle and the first distance value.
  • the flight controller 500 can be configured to provide a control instruction to the propulsion device 400 to control the propulsion device 400 to effect a movement of the aerial vehicle based upon the second distance value.
  • flight controller 500 and the sensor 600 as discussed in embodiments of the disclosure can be implemented with reference to corresponding methods or functional modules as described in the disclosure shown in FIG. 1 to FIG. 10 .
  • a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • a division of modules or units is merely a division based upon a logical function. Different division can be possible in actual implementation. For example, multiple units or components can be combined or integrated on another system. For example, some features can be ignored or not be performed.
  • a mutual coupling, a direct coupling or a communication connection as shown or discussed can be an indirect coupling or a communication connection via an interface, a means or a unit.
  • the coupling can be an electrical coupling or a mechanical coupling.
  • the units illustrated as separate parts may or may not be physically separated.
  • the parts shown as units may or may not be physical units.
  • the parts can be provided at the same location or distributed over a plurality of network units. All or part of the modules can be selected to implement the embodiments of the disclosure according to actual requirements.
  • Various functional units in the embodiments of the disclosure may be integrated in one processing unit.
  • the functional units can be separate and physical units. Two or more units may be integrated in one unit.
  • the integrated units may be implemented as hardware or software functional units.
  • the integrated units if implemented as software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • all or part of the technical solution may be embodied as a software product.
  • the computer software product is stored in a storage medium and includes several instructions for causing a computer processor to execute entire or part of a method according to the various embodiments of the present disclosure.
  • the above-mentioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a ROM (read-only memory), a RAM (random access memory), a diskette, or an optical disk.


Abstract

A method for distance detection includes detecting a first distance value between a movable object and a target object in a target direction of the movable object, obtaining an inclination angle of the movable object at the target direction, and calculating a second distance value from the movable object to the target object based upon the inclination angle and the first distance value.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a continuation application of U.S. application Ser. No. 15/870,174, filed on Jan. 12, 2018, which is a continuation application of International Application No. PCT/CN2015/083868, filed on Jul. 13, 2015, the entire contents of both of which are incorporated herein by reference.
  • TECHNICAL FIELD
  • The present disclosure relates to the technical field of electronics, and more particularly to a method and device for movable object distance detection, and an aerial vehicle.
  • BACKGROUND OF THE DISCLOSURE
  • With the developments in technology, remote control intelligent movable apparatuses, such as remote control robots, remote control cars, unmanned aerial vehicles, can perform reconnaissance and surveillance tasks for a wide variety of military and civilian applications.
  • A distance from the movable apparatus to an object can be detected during a flight of the movable apparatus to detect or track the object in real time. However, the distance detected by prior methods is not accurate, leading to an inaccurate determination by the movable apparatus and inaccurate detecting or tracking.
  • SUMMARY OF THE DISCLOSURE
  • Embodiments of the present disclosure provide a method and device for movable object distance detection, and an aerial vehicle which enable a more accurate distance detection.
  • An aspect of the disclosure provides a method for movable object distance detection, the method comprising detecting a first distance value between the movable object and a target object, the target object being positioned in a target direction of the movable object; obtaining an inclination angle of the movable object at the target direction, and calculating a second distance value from the movable object to the target object based upon the inclination angle and the first distance value; and determining the second distance value as an actual distance between the movable object and the target object.
  • In some embodiments, the method can further comprise controlling a movement of the movable object based upon the second distance value.
  • In some embodiments, detecting a first distance value between the movable object and the target object can comprise obtaining the first distance value between the movable object and the target object based upon distance sensing data of a distance sensing.
  • In some embodiments, detecting a first distance value between the movable object and the target object can comprise obtaining the first distance value between the movable object and the target object based upon visual sensing data of a visual sensing.
  • In some embodiments, obtaining the first distance value between the movable object and the target object based upon visual sensing data of a visual sensing can comprise obtaining at least two images in the target direction; determining feature points in each of the at least two images, comparing feature points and determining correlated feature points between the at least two images; and calculating the first distance value between the movable object and the target object based upon the correlated feature points.
  • In some embodiments, determining feature points in each of the at least two images can comprise obtaining an inclination angle of the movable object at the target direction; and determining an effective area in each of the at least two images based upon the inclination angle, and determining a feature point in an effective area of each of the at least two images.
  • In some embodiments, the correlated feature points can be a group of sparse feature points, and the calculating the first distance value between the movable object and the target object based upon the correlated feature points can comprise calculating the first distance value between the movable object and the target object based upon the sparse feature points.
  • In some embodiments, the correlated feature points can be a group of dense feature points, and the calculating the first distance value between the movable object and the target object based upon the correlated feature points can comprise calculating the first distance value between the movable object and the target object based upon the dense feature points.
  • In some embodiments, the first distance value can be an average value of a plurality of distance values between the movable object and the target object.
  • In some embodiments, the detecting a first distance value between the movable object and a target object can comprise: obtaining distance sensing data sensed by a distance sensing process, and calculating a distance value from the distance sensing data; if the calculated distance value is less than a preset distance threshold, determining the calculated distance value as the first distance value; if the calculated distance value is larger than the preset distance threshold, determining a distance value between the movable object and the target object sensed by a visual sensing process as the first distance value.
  • In some embodiments, controlling a movement of the movable object based upon the actual distance can comprise performing an obstacle avoidance based upon the actual distance. The obstacle avoidance can comprise limiting a moving speed or avoiding the target object.
  • In some embodiments, the controlling a movement of the movable object based upon the actual distance can comprise performing a target tracking based upon the actual distance. The target tracking can comprise controlling a movement of the movable object such that a distance between the movable object and the target object can be within a predetermined tracking distance threshold.
  • Another aspect of the disclosure further provides a device for movable object distance detection, the device comprising: a detection module configured to detect a first distance value between the movable object and a target object, the target object being positioned in a target direction of the movable object; and a processing module configured to obtain an inclination angle of the movable object at the target direction, calculate a second distance value from the movable object to the target object based upon the inclination angle and the first distance value, and determine the second distance value as an actual distance between the movable object and the target object.
  • In some embodiments, the device can further comprise a control module configured to control a movement of the movable object based upon the actual distance.
  • In some embodiments, the detection module can be configured to obtain the first distance value between the movable object and the target object based upon distance sensing data of a distance sensing.
  • In some embodiments, the detection module can be configured to obtain the first distance value between the movable object and the target object based upon visual sensing data of a visual sensing.
  • In some embodiments, the detection module can comprise: a capturing unit configured to obtain at least two images in the target direction; a comparing unit configured to determine feature points in each of the at least two images and compare feature points and determine correlated feature points between the at least two images; and a determining unit configured to calculate the first distance value between the movable object and the target object based upon the correlated feature points.
  • In some embodiments, the comparing unit can be configured to obtain an inclination angle of the movable object at the target direction, determine an effective area in each of the at least two images based upon the inclination angle, and determine a feature point in an effective area of each of the at least two images.
  • In some embodiments, the determined feature points can be a group of sparse feature points. The determining unit can be configured to calculate the first distance value between the movable object and the target object based upon the sparse feature points.
  • In some embodiments, the determined feature points are a group of dense feature points. The determining unit can be configured to calculate the first distance value between the movable object and the target object based upon the dense feature points.
  • In some embodiments, the first distance value can be an average value of a plurality of distance values between the movable object and the target object.
  • In some embodiments, the detection module can comprise: a first obtaining unit configured to obtain distance sensing data sensed by a distance sensing process and calculate a distance value from the distance sensing data; a distance value determining unit configured to, if the calculated distance value is less than a preset distance threshold, set the calculated distance value as a first distance value; and a second obtaining unit configured to, if the calculated distance value is larger than the preset distance threshold, set a distance value between the movable object and the target object sensed by a visual sensing process as the first distance value.
  • In some embodiments, the control module can be configured to perform an obstacle avoidance based upon the actual distance. The obstacle avoidance can comprise limiting a moving speed or avoiding the target object.
  • In some embodiments, the control module can be configured to perform a target tracking based upon the actual distance. The target tracking operation can comprise controlling a movement of the movable object such that a distance between the movable object and the target object can be within a predetermined tracking distance threshold.
• Yet another aspect of the disclosure provides an aerial vehicle, the aerial vehicle comprising a propulsion device, a flight controller and a sensor. The sensor can be configured to sense distance data from the aerial vehicle to a target object. The flight controller can be configured to determine a first distance value from the aerial vehicle to a target object based upon the distance data of the sensor, the target object being positioned in a target direction of the aerial vehicle, obtain an inclination angle of the aerial vehicle at the target direction, and calculate a second distance value from the aerial vehicle to the target object based upon the inclination angle and the first distance value. The flight controller can be further configured to provide a control instruction to the propulsion device to control the propulsion device to effect a movement of the aerial vehicle based upon the second distance value.
• A still further aspect of the disclosure provides an aerial vehicle, the aerial vehicle comprising a propulsion device, a flight controller and a sensor. The sensor can be configured to detect a first distance value from the aerial vehicle to a target object, the target object being positioned in a target direction of the movable object. The flight controller can be configured to obtain an inclination angle of the aerial vehicle at the target direction, and calculate a second distance value from the aerial vehicle to the target object based upon the inclination angle and the first distance value. The flight controller can be further configured to provide a control instruction to the propulsion device to control the propulsion device to effect a movement of the aerial vehicle based upon the second distance value.
  • Embodiments of the disclosure can correct a directly measured distance value to obtain a more accurate distance value, thereby enabling an accurate movement control of the movable object.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a flowchart of a method for movable object distance detection in accordance with an embodiment of the disclosure.
  • FIG. 2 shows a schematic view of determining a distance between a movable object and a target object in accordance with an embodiment of the disclosure.
  • FIG. 3 shows a flowchart of a method for movable object distance detection in accordance with another embodiment of the disclosure.
  • FIG. 4 shows a flowchart of a method of measuring a distance value in accordance with an embodiment of the disclosure.
  • FIG. 5 shows a structure of a sensor assembly sensing a distance in accordance with an embodiment of the disclosure.
  • FIG. 6 shows a distance measurement using a binocular camera in accordance with an embodiment of the disclosure.
  • FIG. 7 shows a calculation in distance measurement using a binocular camera in accordance with an embodiment of the disclosure.
  • FIG. 8 shows a structure of a device for detecting distance for a movable object in accordance with an embodiment of the disclosure.
  • FIG. 9 shows a structure of a detection module in accordance with an embodiment of the disclosure.
  • FIG. 10 shows a structure of a detection module in accordance with another embodiment of the disclosure.
  • FIG. 11 shows a structure of an aerial vehicle in accordance with an embodiment of the disclosure.
  • FIG. 12 shows a structure of an aerial vehicle in accordance with another embodiment of the disclosure.
  • DETAILED DESCRIPTION OF THE DISCLOSURE
• A better understanding of the disclosure will be obtained by reference to the following detailed description that sets forth illustrative embodiments with reference to the drawings. It will be apparent that the embodiments described herein are merely some of the embodiments of the disclosure. Those skilled in the art can conceive various embodiments in light of those disclosed herein without inventive efforts, and all such embodiments are within the scope of the disclosure.
  • With embodiments of the disclosure, in detecting a distance between a movable object and a target object, a directly detected distance can be corrected with an inclination angle of the movable object. Therefore, a more accurate distance value can be obtained, enabling an accurate movement control including an obstacle avoidance and a tracking based on the distance value.
  • FIG. 1 shows a flowchart of a method for movable object distance detection in accordance with an embodiment of the disclosure. The method can be performed by a processor. In some instances, the method can be performed by a movement controller onboard a movable object, such as a flight controller onboard an unmanned aerial vehicle. In some embodiments, the method can comprise steps S101 to S103.
  • In step S101, a process of detecting a first distance value between the movable object and a target object can be performed. The target object can be positioned in a target direction of the movable object.
• The target direction can be dead ahead in the movement direction of the movable object. For instance, the target direction can be a direction A (e.g., a flight direction) as shown in FIG. 2. The movable object can be provided with a monocular or binocular visual sensor, a distance sensor such as an ultrasonic sensor, an infrared sensor, a radar sensor and/or a laser sensor, and other sensors capable of sensing a distance. The first distance value D1 can be calculated from sensed data provided by the visual sensor and/or the distance sensor as discussed hereinabove.
• The visual sensor or distance sensor can be configured to sense and calculate distance values over an area, so distances from the movable object to a plurality of feature points on a given plane of the target object can be obtained. Therefore, the first distance value D1 can be a representative value. In some instances, the first distance value D1 can be an average of distance values from the movable object to a plurality of points. Optionally, the first distance value D1 can be the distance value from the movable object to the feature point having the most prominent feature among the plurality of feature points, such that the distance value can be the most accurate distance value.
• In some instances, the first distance value D1 between the movable object and the target object can be detected and obtained based upon distance sensing data of a distance sensing process. Optionally, the first distance value D1 between the movable object and the target object can be detected and obtained based upon visual sensing data of a visual sensing process.
  • In step S102, a process of obtaining an inclination angle of the movable object at the target direction and calculating a second distance value from the movable object to the target object based upon the obtained inclination angle and the first distance value can be performed.
  • The inclination angle a can be measured using a sensor onboard the movable object, such as a gyroscope or an accelerometer. The inclination angle a can be an intersection angle between a moving direction of the movable object and a horizontal plane. In other words, the inclination angle a can be an intersection angle between the target direction and the horizontal plane. For example, an aerial vehicle can be inclined as shown in FIG. 2 when the aerial vehicle is travelling forward. The inclination and rotation can be sensed by a sensor onboard the aerial vehicle to obtain the inclination angle a.
• Once the first distance value D1 and the inclination angle a are obtained, a computation using trigonometric functions can be performed on the first distance value D1 and the inclination angle, such that the second distance value D2 can be obtained by correcting the first distance value D1.
  • In step S103, a process of determining the second distance value as an actual distance between the movable object and the target object can be performed.
  • Once the actual distance between the movable object and the target object is determined, the actual distance value can be directly provided to a connected controller, such that the controller can direct a movement of the movable object based upon the actual distance value. Optionally, the actual distance can be provided to a user, such that the user can view the actual distance.
• In some instances, a movement of the movable object can be controlled based upon the actual distance. Once the corrected and thus more accurate actual distance D2 is obtained, the movable object can be controlled to effect a manipulation including an obstacle avoidance and a tracking based upon the actual distance D2. For instance, in case the actual distance D2 is less than a distance threshold, a movement of the movable object can be stopped to avoid crashing into the target object. Optionally, a new flight route can be planned to bypass the target object. For instance, in case the actual distance D2 is larger than a distance threshold, a further movement of the movable object can be necessary to keep the target object within the distance threshold to enable surveillance.
• It will be appreciated that the visual sensor or distance sensor measures a distance from the sensor itself to the target object. Therefore, a predetermined value (e.g., a value D) can be added to the distance value calculated from the sensed data. Optionally, the distance value calculated from the sensed data can be used directly as the first distance value D1 between the movable object and the target object, and a distance threshold can be configured in view of that fixed offset to effect a movement control.
  • With embodiments of the disclosure, a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • FIG. 3 shows a flowchart of a method for movable object distance detection in accordance with another embodiment of the disclosure. The method can be performed by a processor. In some instances, the method can be performed by a movement controller onboard a movable object, such as a flight controller onboard an unmanned aerial vehicle. In some embodiments, the method can comprise steps S301 to S306.
  • In step S301, a process of obtaining distance sensing data by a distance sensing process and processing the distance sensing data to calculate a distance value can be performed.
  • In some instances, the distance sensing process can comprise effecting a distance sensing using a sensed data of a distance sensor. The distance sensor can comprise an ultrasonic sensor, an infrared sensor, a radar sensor and/or a laser sensor.
  • In step S302, a process of determining whether the calculated distance value is less than a preset distance threshold can be performed.
• In some instances, the distance threshold can be set based upon an effective detection range of a distance sensor such as an ultrasonic sensor. The effective detection range of a typical ultrasonic sensor can be about 3 meters. The preset distance threshold can be set as 3 meters if an ultrasonic sensor is used as the distance sensor for sensing a distance. A distance value sensed by the ultrasonic sensor can be set as a first distance value if that distance value is less than 3 meters. That is, a step S303 can be performed. Otherwise, a step S304 in which a visual sensor is used can be performed if the distance value sensed by the ultrasonic sensor is not less than 3 meters.
  • In step S303, a process of determining the distance value sensed by the ultrasonic sensor as a first distance value can be performed if the calculated distance value is less than the preset distance threshold.
  • In step S304, a process of obtaining a distance value between the movable object and the target object by a visual sensing process and determining the distance value as the first distance value can be performed if the calculated distance value is larger than the preset distance threshold.
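• The selection between the two sensing processes in steps S301 to S304 can be sketched as follows (a minimal illustration; the 3-meter value is the example threshold from the description, while the function and parameter names are assumptions):

```python
# Assumed effective range of the ultrasonic sensor (see step S302); the
# description gives 3 meters as an example preset distance threshold.
PRESET_DISTANCE_THRESHOLD_M = 3.0

def select_first_distance(ultrasonic_distance_m, visual_distance_m):
    """Pick the first distance value D1: use the ultrasonic reading when it
    falls inside the sensor's effective range (step S303), otherwise fall
    back to the distance obtained by the visual sensing process (step S304)."""
    if ultrasonic_distance_m < PRESET_DISTANCE_THRESHOLD_M:
        return ultrasonic_distance_m
    return visual_distance_m
```
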
  • In step S305, a process of obtaining an inclination angle of the movable object at a target direction and calculating a second distance value from the movable object to the target object based upon the obtained inclination angle and the first distance value can be performed.
  • The inclination angle can be measured using a sensor onboard the movable object, such as a gyroscope or an accelerometer. The inclination angle can be an intersection angle between a moving direction of the movable object and a horizontal plane. In other words, the inclination angle can be an intersection angle between the target direction and the horizontal plane. For example, an aerial vehicle can be inclined as shown in FIG. 2 when the aerial vehicle is travelling forward. The inclination and rotation can be sensed by a sensor onboard the aerial vehicle to obtain the inclination angle.
• Once the first distance value and the inclination angle are obtained, a computation using trigonometric functions can be performed on the first distance value and the inclination angle, such that the second distance value can be obtained by correcting the first distance value. For example, a formula D2=cos(a)*D1 can be used, where D2 is the second distance value, D1 is the first distance value, and a is the inclination angle.
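• The correction of step S305 can be sketched as follows (a minimal illustration of the D2=cos(a)*D1 formula; the function and parameter names are hypothetical):

```python
import math

def correct_distance(first_distance_m, inclination_angle_deg):
    """Correct the directly measured first distance value D1 using the
    inclination angle a between the target direction and the horizontal
    plane, per the formula D2 = cos(a) * D1."""
    return math.cos(math.radians(inclination_angle_deg)) * first_distance_m

# A vehicle pitched 60 degrees measuring 10 m along its axis is about
# 5 m from the target horizontally, since cos(60 degrees) = 0.5.
```
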
  • In step S306, a process of determining the second distance value as an actual distance between the movable object and the target object can be performed.
  • In some instances, a movement of the movable object can be controlled based upon the actual distance. Controlling the movement of the movable object based upon the actual distance can comprise performing an obstacle avoidance based upon the actual distance. For instance, the obstacle avoidance can comprise limiting a moving speed or avoiding a target object.
• With the obtained actual distance available, a flight speed of the movable object can be slowed down to a predefined minimum speed as the movable object gets closer to the target object. When the movable object moves at a high speed, a current moving speed of the movable object and a distance for stopping the movable object from the current moving speed (also referred to as a "stopping distance") can be known. When the movable object detects a target object in front and the distance to the target object is shorter than a distance threshold that is a sum of the stopping distance and a safe distance, the movable object can enter an emergency braking state to avoid crashing into the obstacle, e.g., the target object. If the distance to the target object is shorter than the safe distance (and hence shorter than the distance threshold), the aircraft, e.g., the movable object, can automatically retreat to a position outside the safe distance. The retreat can be faster if the distance to the detected target object is smaller.
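• The braking decision above can be sketched as follows (illustrative only; the constant-deceleration stopping-distance model and all names are assumptions, as the description only says the stopping distance "can be known"):

```python
def avoidance_action(actual_distance_m, speed_mps, max_decel_mps2, safe_distance_m):
    """Decide an obstacle-avoidance action from the corrected actual distance.
    The stopping distance is modeled as v^2 / (2*a) for a constant
    deceleration a (an assumption made for this sketch)."""
    stopping_distance_m = speed_mps ** 2 / (2.0 * max_decel_mps2)
    if actual_distance_m < safe_distance_m:
        return "retreat"    # inside the safe distance: back off
    if actual_distance_m < stopping_distance_m + safe_distance_m:
        return "brake"      # emergency braking state
    return "proceed"
```
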
  • Controlling the movement of the movable object based upon the actual distance can further comprise tracking a target based upon the actual distance. In some instances, tracking a target can comprise controlling a movement of the movable object such that a distance between the movable object and the target object is within a predetermined tracking distance threshold.
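• The tracking behavior can be sketched as follows (a hypothetical control stub; the deadband and the retreat-when-too-close behavior are assumptions beyond what the description states):

```python
def tracking_command(actual_distance_m, tracking_threshold_m, deadband_m=0.5):
    """Keep the target within the predetermined tracking distance threshold:
    advance when the target is beyond the threshold, retreat when well
    inside it, otherwise hold position."""
    if actual_distance_m > tracking_threshold_m:
        return "advance"
    if actual_distance_m < tracking_threshold_m - deadband_m:
        return "retreat"
    return "hold"
```
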
  • FIG. 4 shows a flowchart of a method of measuring a distance value in accordance with an embodiment of the disclosure. In some embodiments, the first distance value as discussed hereinabove can be calculated using a binocular visual sensor. The method can comprise steps S401 to S403.
  • In step S401, a process of obtaining at least two images in a target direction can be performed.
  • FIG. 5 shows a configuration of a sensor assembly capable of sensing a distance which can comprise two visual sensors (e.g., a camera 501 and a camera 502). In addition, FIG. 5 shows distance sensors such as an ultrasonic probe 503 and an ultrasonic probe 504. The sensor assembly can be provided on each surface of the movable object, such that a distance value from the movable object to the target object in every possible moving direction of the movable object can be measured.
• A pixel can be identified in an image captured by camera 502 corresponding to a pixel in an image captured by camera 501, and vice versa. The first distance value can be calculated from the two corresponding pixels, the coordinates of the pixels in their respective images, a focal length of each camera and a distance between the two cameras.
  • In step S402, a process of determining feature points in each of the at least two images, comparing the feature points and determining correlated feature points between the at least two images can be performed.
• Pixels having prominent characteristics can be extracted from images captured by the binocular visual sensor. A process of extracting a feature point can be implemented using an algorithm such as FAST (features from accelerated segment test), SIFT (scale-invariant feature transform) or SURF (speeded up robust features). For instance, a feature point extracted by the FAST algorithm can be described using a BRIEF (binary robust independent elementary features) descriptor.
• Correlated feature points between two images can be determined by a binocular feature point matching. For instance, correlated feature points can be determined by comparing a degree of similarity of corresponding pixels between the two images using the BRIEF descriptors of the feature points extracted by the FAST algorithm. The correlated feature points can comprise the pixels of a measured point of the target object in the two binocular visual images.
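• The similarity comparison can be sketched with a toy matcher over BRIEF-style binary descriptors (real BRIEF descriptors are typically 256-bit; the short byte strings and the greedy nearest-neighbour strategy here are simplifications for illustration):

```python
def hamming(desc_a, desc_b):
    """Hamming distance between two binary descriptors given as byte strings."""
    return sum(bin(a ^ b).count("1") for a, b in zip(desc_a, desc_b))

def match_features(left_descs, right_descs, max_distance=30):
    """Greedy binocular matching: for each left-image descriptor, find the
    most similar right-image descriptor and keep the pair as correlated
    feature points if the distance is small enough."""
    matches = []
    for i, dl in enumerate(left_descs):
        j, dist = min(((j, hamming(dl, dr)) for j, dr in enumerate(right_descs)),
                      key=lambda pair: pair[1])
        if dist <= max_distance:
            matches.append((i, j))
    return matches
```
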
  • In step S403, a process of calculating a first distance value from the movable object to the target object based upon the determined correlated feature points can be performed.
• Once the correlated feature points are determined, a distance value to each feature point can be calculated based upon triangulation, and the first distance value can then be obtained. FIG. 6 and FIG. 7 show the theory of binocular visual distance measurement. A distance value from the movable object to a measured point P can be calculated as Z=fT/(xl−xr), where Z is the distance value from the movable object to the measured point P, f is a focal length of the cameras, T is the distance between the two cameras, and xl, xr are the coordinates of the measured point P projected on the two images (e.g., the coordinates of the correlated feature points in their respective images). The first distance value can be obtained by averaging the distance values obtained from a plurality of measured points.
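• The triangulation of step S403 can be sketched as follows (the function names and units are assumptions; the formula is the Z=fT/(xl−xr) relation above, which presumes a rectified binocular pair):

```python
def depth_from_disparity(focal_px, baseline_m, x_left_px, x_right_px):
    """Depth of a measured point P from a rectified binocular pair:
    Z = f * T / (xl - xr), with the focal length f in pixels, the camera
    baseline T in meters, and xl, xr the projections of P in the two images."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("non-positive disparity: point cannot be triangulated")
    return focal_px * baseline_m / disparity

def first_distance(depths_m):
    """Average the depths of several measured points to obtain D1."""
    return sum(depths_m) / len(depths_m)
```
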
  • In some instances, areas in the two images, in which areas the correlated feature points are positioned, can be specified to reduce a computation in determining the correlated feature points. In some instances, the process of determining feature points in the at least two images can comprise obtaining an inclination angle of the movable object at a target direction, determining an effective area in each of the at least two images based upon the obtained angle and determining a feature point in the effective area of each image.
• The inclination angle of the movable object can be obtained using an IMU (inertial measurement unit). A camera can point downward with a movement of the movable object, thus an image captured by the camera can comprise the dead ahead area of the movable object and an area below the dead ahead area. The dead ahead area of the movable object can be positioned at an upper portion of the image captured by the camera. Once the inclination angle is measured by the IMU, a proportion of the dead ahead area of the movable object in the image can be estimated, such that the effective area can be determined for further calculation. If the camera points upward with the movement of the movable object, then the dead ahead area of the movable object can be positioned at a lower portion of the image captured by the camera. The effective area can be similarly determined based upon the inclination angle measured by the IMU.
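• The effective-area estimate can be sketched as follows (placing the split at focal_px*tan(angle) pixels from the image centre is a geometric assumption for this sketch; the description only states that the proportion is estimated from the IMU angle):

```python
import math

def effective_rows(image_height_px, focal_px, inclination_deg, tilted_down=True):
    """Return the (start, stop) row range covering the dead-ahead area.
    With the camera tilted down, the flight direction projects above the
    image centre, so the dead-ahead area occupies the upper rows; with the
    camera tilted up, it occupies the lower rows."""
    shift_px = int(focal_px * math.tan(math.radians(inclination_deg)))
    split = max(0, min(image_height_px, image_height_px // 2 + shift_px))
    if tilted_down:
        return (0, split)                                   # upper portion
    return (image_height_px - split, image_height_px)       # lower portion
```
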
  • In some instances, a group of relatively sparse feature points can be obtained using the FAST algorithm. The first distance value from the movable object to the target object can be calculated based upon the determined sparse feature points.
• Optionally, a group of dense feature points can be obtained by block matching in combination with a speckle filter. The first distance value from the movable object to the target object can be calculated based upon the determined dense feature points.
  • With embodiments of the disclosure, a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
• A detailed description will now be provided of a device for detecting distance for a movable object and of an aerial vehicle in accordance with embodiments of the disclosure.
• FIG. 8 shows a structure of a device for detecting distance for a movable object in accordance with an embodiment of the disclosure. In some instances, the device can be provided in a processor. Optionally, the device can be provided in a movement controller onboard a movable object, such as a flight controller of an aerial vehicle. In some embodiments, the device can comprise a detection module 1 and a processing module 2.
  • The detection module 1 can be configured to detect a first distance value between the movable object and a target object. The target object can be positioned in a target direction of the movable object.
  • The processing module 2 can be configured to obtain an inclination angle of the movable object at the target direction, calculate a second distance value from the movable object to the target object based upon the obtained inclination angle and the first distance value, and determine the second distance value as an actual distance between the movable object and the target object.
  • In some instances, the device can further comprise a control module 3 configured to control a movement of the movable object based upon the actual distance.
• The target direction can be dead ahead in the movement direction of the movable object. The movable object can be provided with a monocular or binocular visual sensor, a distance sensor such as an ultrasonic sensor, an infrared sensor, a radar sensor and/or a laser sensor, and other sensors capable of sensing a distance. The detection module 1 can be configured to calculate the first distance value from sensed data sensed by the visual sensor and/or the distance sensor.
• The detection module 1 can obtain distances from the movable object to a plurality of feature points on a given plane of the target object. Therefore, the first distance value can be a representative value. In some instances, the first distance value can be an average of a plurality of distance values from the movable object to more than one of the plurality of feature points. Optionally, the first distance value can be the distance value from the movable object to the feature point having the most prominent feature among the plurality of feature points, such that the distance value can be the most accurate distance value.
• In some instances, the detection module 1 can be configured to detect and obtain the first distance value between the movable object and the target object based upon distance sensing data of a distance sensing process. Optionally, the detection module 1 can be configured to detect and obtain the first distance value between the movable object and the target object based upon visual sensing data of a visual sensing process.
  • The processing module 2 can be configured to obtain an inclination angle from a sensed data of a sensor onboard the movable object, such as a gyroscope or an accelerometer. The inclination angle a can be an intersection angle between a moving direction of the movable object and a horizontal plane. In other words, the inclination angle a can be an intersection angle between the target direction and the horizontal plane. For example, an aerial vehicle can be inclined when the aerial vehicle is travelling forward. The processing module 2 can be configured to sense the inclination and rotation using a sensor onboard the aerial vehicle to obtain the inclination angle.
  • Once the first distance value and the inclination angle are obtained, a computation to the first distance value and the inclination angle using trigonometric functions can be performed by the processing module 2, such that the second distance value can be obtained by correcting the first distance value. The second distance value can be the actual distance between the movable object and the target object.
• Once the corrected and more accurate actual distance between the movable object and the target object is obtained, the processing module 2 can be configured to directly provide the actual distance to other controllers or a user end. Optionally, the processing module 2 can be configured to directly control the movable object, through the control module 3, to effect a manipulation including an obstacle avoidance, tracking and monitoring based upon the actual distance. For instance, in case the actual distance is less than a distance threshold, a movement of the movable object can be stopped to avoid crashing into the target object. Optionally, a new flight route can be planned to bypass the target object. For instance, in case the actual distance is larger than a distance threshold, a further movement of the movable object can be necessary to keep the target object within the distance threshold to enable surveillance.
• It will be appreciated that the visual sensor or distance sensor measures a distance from the sensor itself to the target object. Therefore, a predetermined value can be added to the distance value calculated from the sensed data. Optionally, the distance value calculated from the sensed data can be used directly as the first distance value between the movable object and the target object, and a distance threshold can be configured in view of that fixed offset to effect a movement control.
  • With embodiments of the disclosure, a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • Embodiments of the disclosure further provide a structure of another device for detecting distance for a movable object. In some instances, the device can be provided in a processor. Optionally, the device can be disposed in a movement controller onboard a movable object, such as a flight controller of an aerial vehicle. In some embodiments, the device can comprise the detection module 1, processing module 2 and control module 3 as discussed hereinabove.
  • In some embodiments, the detection module 1 can be configured to detect and obtain a first distance value between the movable object and the target object based upon distance sensing data of a distance sensing. The distance sensor can comprise an ultrasonic distance sensor, a radar distance sensor or an infrared distance sensor.
  • In some embodiments, the detection module 1 can be configured to detect and obtain a first distance value between the movable object and the target object based upon visual sensing data of a visual sensing. The visual sensing can comprise a visual sensing system having two cameras.
  • In some embodiments, FIG. 9 shows the detection module 1 configured to sense a distance using a visual sensing. The detection module 1 can comprise a capturing unit 11, a comparing unit 12 and a determining unit 13.
  • The capturing unit 11 can be configured to capture at least two images in a target direction.
  • The comparing unit 12 can be configured to determine feature points in the at least two images, compare feature points and determine correlated feature points between the at least two images.
• The determining unit 13 can be configured to calculate a first distance value from the movable object to the target object based upon the determined correlated feature points.
• In some instances, the comparing unit 12 can be configured to obtain an inclination angle of the movable object at a target direction, determine an effective area in each of the at least two images based upon the obtained angle, and determine a feature point in the effective area of each image.
  • In some instances, the first distance value can be an average of a plurality of distance values between the movable object and the target object.
  • In some instances, the detection module 1 can be configured to detect the first distance value by a distance sensing process and a visual sensing process.
• In some instances, the detection module 1 can be configured to determine which distance sensing process is to be used based upon a distance detected by a distance sensor and/or a distance detected by a visual sensor. FIG. 10 shows the detection module 1, which can comprise a first obtaining unit 14, a distance value determining unit 15 and a second obtaining unit 16.
  • The first obtaining unit 14 can be configured to obtain distance sensing data sensed by a distance sensing process and calculate a distance value from the distance sensing data.
  • The distance value determining unit 15 can be configured to, if the calculated distance value is less than a preset distance threshold, set the distance value as a first distance value.
  • The second obtaining unit 16 can be configured to, if the calculated distance value is larger than the preset distance threshold, obtain a first distance value between the movable object and the target object by a visual sensing process.
  • In some instances, the control module 3 can be configured to perform an obstacle avoidance based upon the actual distance. For instance, the obstacle avoidance can comprise limiting a moving speed or avoiding a target object.
  • In some instances, the control module 3 can be configured to perform a target tracking based upon the actual distance. For instance, the target tracking can comprise controlling a movement of the movable object, such that a distance between the movable object and the target object is within a predetermined tracking distance threshold.
  • It will be appreciated that, the functional modules and units as discussed in embodiments of the disclosure can be implemented with reference to corresponding methods or functional modules as described in the disclosure.
  • With embodiments of the disclosure, a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • FIG. 11 shows a structure of an aerial vehicle in accordance with embodiments of the disclosure. The aerial vehicle can comprise a propulsion device 100, a flight controller 200 and a sensor 300 in addition to various prior art vehicle frames and power supplies.
  • The sensor 300 can be configured to sense distance data from the aerial vehicle to a target object.
  • The flight controller 200 can be configured to determine a first distance value from the aerial vehicle to the target object based upon the sensed distance data of the sensor 300, the target object being positioned in a target direction of the aerial vehicle, obtain an inclination angle of the aerial vehicle at the target direction, and calculate a second distance value from the aerial vehicle to the target object based upon the obtained inclination angle and the first distance value.
  • In some instances, the flight controller 200 can be configured to provide a control instruction to the propulsion device 100 to control the propulsion device to effect a movement of the aerial vehicle based upon the second distance value.
  • It will be appreciated that the flight controller 200 and the sensor 300 as discussed in embodiments of the disclosure can be implemented with reference to the corresponding methods or functional modules described in connection with FIG. 1 to FIG. 10.
  • With embodiments of the disclosure, a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • FIG. 12 shows a structure of an aerial vehicle in accordance with another embodiment of the disclosure. The aerial vehicle can comprise a propulsion device 400, a flight controller 500, and a sensor 600, in addition to a vehicle frame and a power supply as known in the art.
  • The sensor 600 can be configured to detect a first distance value from the aerial vehicle to a target object. The target object can be positioned in a target direction of the movable object.
  • The flight controller 500 can be configured to obtain an inclination angle of the aerial vehicle at the target direction and calculate a second distance value from the aerial vehicle to the target object based upon the obtained inclination angle and the first distance value.
  • In some instances, the flight controller 500 can be configured to provide a control instruction to the propulsion device 400 to control the propulsion device 400 to effect a movement of the aerial vehicle based upon the second distance value.
  • It will be appreciated that the flight controller 500 and the sensor 600 as discussed in embodiments of the disclosure can be implemented with reference to the corresponding methods or functional modules described in connection with FIG. 1 to FIG. 10.
  • With embodiments of the disclosure, a directly detected distance value can be corrected to obtain a more accurate distance value, thereby enabling a more accurate control on a movement of the movable object.
  • It will be appreciated that the device and method disclosed in embodiments of the disclosure can be implemented in other manners. The described device embodiments are merely illustrative. For example, the division into modules or units is merely a division based upon logical function; other divisions are possible in actual implementation. Multiple units or components can be combined or integrated into another system, and some features can be omitted or not performed. Further, a mutual coupling, direct coupling, or communication connection as shown or discussed can instead be an indirect coupling or a communication connection via an interface, a means, or a unit, and the coupling can be electrical or mechanical.
  • The units illustrated as separate parts may or may not be physically separated. The parts shown as units may or may not be physical units. For example, the parts can be provided at the same location or distributed over a plurality of network units. All or part of the modules can be selected to implement the embodiments of the disclosure according to actual requirements.
  • Various functional units in the embodiments of the disclosure may be integrated into one processing unit, may exist as physically separate units, or two or more units may be integrated into one unit. The integrated units may be implemented as hardware or as software functional units.
  • The integrated units, if implemented as software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. With such an understanding, all or part of the technical solution may be embodied as a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer processor to execute all or part of a method according to the various embodiments of the present disclosure. The above-mentioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • The foregoing embodiments are intended to merely illustrate rather than limit the scope of the present disclosure. Numerous equivalent structures or equivalent flow variations made in light of the specification and the accompanying drawings of the present disclosure, whether directly or indirectly applied to other related technical art, are within the scope of the present disclosure.

Claims (20)

What is claimed is:
1. A method for detecting distance between an aerial vehicle and a target object, comprising:
acquiring, using one or more first sensors configured on the aerial vehicle, a first distance value corresponding to a distance between the aerial vehicle and the target object in a target direction of the aerial vehicle, the one or more first sensors including one or more of a distance sensor and a visual sensor;
acquiring, using a second sensor configured on the aerial vehicle, an inclination angle of the aerial vehicle, the inclination angle being an intersection angle between a moving direction of the aerial vehicle and a horizontal plane, and the second sensor being an inclination sensor;
calculating, by a processor, a second distance value from the aerial vehicle to the target object based upon the inclination angle and the first distance value; and
controlling, by a flight controller configured on the aerial vehicle, a movement of the aerial vehicle based upon the second distance value.
2. The method of claim 1, wherein:
the one or more first sensors include two visual sensors; and
acquiring the first distance value between the aerial vehicle and the target object includes acquiring the first distance value using the two visual sensors.
3. The method of claim 2, wherein acquiring the first distance value between the aerial vehicle and the target object comprises:
controlling the two visual sensors to acquire at least two images in the target direction;
determining, by the processor, feature points in each of the at least two images;
performing, by the processor, feature point comparison and determining correlated feature points between the at least two images; and
calculating, by the processor, the first distance value between the aerial vehicle and the target object based upon the correlated feature points.
4. The method of claim 3, wherein determining the feature points in each of the at least two images comprises:
determining, by the processor, an effective area in each of the at least two images based upon the inclination angle; and
determining, by the processor, the feature points in the effective area of each of the at least two images.
5. The method of claim 3, wherein:
the correlated feature points include a group of sparse feature points, and
calculating the first distance value between the aerial vehicle and the target object based upon the correlated feature points includes calculating the first distance value between the aerial vehicle and the target object based upon the sparse feature points.
6. The method of claim 3, wherein:
the correlated feature points include a group of dense feature points, and
calculating the first distance value between the aerial vehicle and the target object based upon the correlated feature points comprises calculating the first distance value between the aerial vehicle and the target object based upon the dense feature points.
7. The method of claim 1, wherein acquiring the first distance value between the aerial vehicle and the target object comprises:
acquiring distance sensing data sensed from the distance sensor, and calculating a distance value from the distance sensing data;
if the calculated distance value is less than a preset distance threshold, determining the calculated distance value as the first distance value; and
if the calculated distance value is larger than the preset distance threshold, determining a distance value between the aerial vehicle and the target object sensed by a visual sensing process as the first distance value.
8. The method of claim 1, wherein controlling the movement of the aerial vehicle based upon the second distance comprises performing an obstacle avoidance based upon the second distance, the obstacle avoidance comprising limiting a moving speed of the aerial vehicle or avoiding the target object.
9. The method of claim 8, wherein controlling the movement of the aerial vehicle based upon the second distance comprises performing a target tracking based upon the second distance, the target tracking comprising controlling the movement of the aerial vehicle such that a distance between the aerial vehicle and the target object is within a predetermined tracking distance threshold.
10. The method of claim 8, wherein controlling the movement of the aerial vehicle based upon the second distance comprises:
in response to the second distance value being below a distance threshold, controlling the aerial vehicle to enter an emergency braking state, the distance threshold being a sum of a stopping distance and a safe distance.
11. A device for distance detection comprising:
a processor; and
a memory storing instructions that, when executed by the processor, cause the processor to:
control one or more first sensors configured on an aerial vehicle to acquire a first distance value corresponding to a distance between the aerial vehicle and a target object in a target direction of the aerial vehicle, the one or more first sensors including one or more of a distance sensor and a visual sensor;
control a second sensor configured on the aerial vehicle to acquire an inclination angle of the aerial vehicle at the target direction, the inclination angle being an intersection angle between a moving direction of the aerial vehicle and a horizontal plane, and the second sensor being an inclination sensor;
calculate a second distance value from the aerial vehicle to the target object based upon the inclination angle and the first distance value; and
control a movement of the aerial vehicle based upon the second distance value.
12. The device of claim 11, wherein:
the one or more first sensors include two visual sensors; and
the instructions further cause the processor to acquire the first distance value using the two visual sensors.
13. The device of claim 12, wherein the instructions further cause the processor to:
control the two visual sensors to acquire at least two images in the target direction;
determine feature points in each of the at least two images;
perform feature point comparison and determine correlated feature points between the at least two images; and
calculate the first distance value between the aerial vehicle and the target object based upon the correlated feature points.
14. The device of claim 13, wherein the instructions further cause the processor to:
determine an effective area in each of the at least two images based upon the inclination angle; and
determine the feature points in the effective area of each of the at least two images.
15. The device of claim 13, wherein:
the correlated feature points include a group of sparse feature points, and
the instructions further cause the processor to calculate the first distance value between the aerial vehicle and the target object based upon the sparse feature points.
16. The device of claim 13, wherein:
the correlated feature points include a group of dense feature points, and
the instructions further cause the processor to calculate the first distance value between the aerial vehicle and the target object based upon the dense feature points.
17. The device of claim 11, wherein the instructions further cause the processor to:
acquire distance sensing data sensed by the distance sensor, and calculate a distance value from the distance sensing data;
if the calculated distance value is less than a preset distance threshold, determine the calculated distance value as the first distance value; and
if the calculated distance value is larger than the preset distance threshold, determine a distance value between the aerial vehicle and the target object sensed by a visual sensing process as the first distance value.
18. The device of claim 11, wherein the instructions further cause the processor to:
control a movement of the aerial vehicle based upon the second distance value.
19. The device of claim 18, wherein the instructions further cause the processor to perform an obstacle avoidance based upon the second distance, the obstacle avoidance comprising limiting a moving speed of the aerial vehicle or avoiding the target object.
20. An aerial vehicle comprising:
a propulsion device;
one or more first sensors configured to acquire a first distance value corresponding to a distance between the aerial vehicle and a target object in a target direction of the aerial vehicle, the one or more first sensors including one or more of a distance sensor and a visual sensor;
a second sensor configured to acquire an inclination angle of the aerial vehicle, the inclination angle being an intersection angle between a moving direction of the aerial vehicle and a horizontal plane, and the second sensor being an inclination sensor;
a processor configured to calculate a second distance value from the aerial vehicle to the target object based upon the inclination angle and the first distance value; and
a flight controller configured to control a movement of the aerial vehicle based upon the second distance value.
US16/817,205 2015-07-13 2020-03-12 Method and device for movable object distance detection, and aerial vehicle Abandoned US20200208970A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/817,205 US20200208970A1 (en) 2015-07-13 2020-03-12 Method and device for movable object distance detection, and aerial vehicle

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
PCT/CN2015/083868 WO2017008224A1 (en) 2015-07-13 2015-07-13 Moving object distance detection method, device and aircraft
US15/870,174 US10591292B2 (en) 2015-07-13 2018-01-12 Method and device for movable object distance detection, and aerial vehicle
US16/817,205 US20200208970A1 (en) 2015-07-13 2020-03-12 Method and device for movable object distance detection, and aerial vehicle

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/870,174 Continuation US10591292B2 (en) 2015-07-13 2018-01-12 Method and device for movable object distance detection, and aerial vehicle

Publications (1)

Publication Number Publication Date
US20200208970A1 true US20200208970A1 (en) 2020-07-02

Family

ID=57216277

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/870,174 Active 2035-11-05 US10591292B2 (en) 2015-07-13 2018-01-12 Method and device for movable object distance detection, and aerial vehicle
US16/817,205 Abandoned US20200208970A1 (en) 2015-07-13 2020-03-12 Method and device for movable object distance detection, and aerial vehicle

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/870,174 Active 2035-11-05 US10591292B2 (en) 2015-07-13 2018-01-12 Method and device for movable object distance detection, and aerial vehicle

Country Status (3)

Country Link
US (2) US10591292B2 (en)
CN (1) CN106104203B (en)
WO (1) WO2017008224A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220391610A1 (en) * 2021-06-07 2022-12-08 Goodrich Corporation Land use for target prioritization

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017008224A1 (en) * 2015-07-13 2017-01-19 深圳市大疆创新科技有限公司 Moving object distance detection method, device and aircraft
WO2018090205A1 (en) * 2016-11-15 2018-05-24 SZ DJI Technology Co., Ltd. Method and system for image-based object detection and corresponding movement adjustment maneuvers
CN106774413A (en) * 2016-12-30 2017-05-31 易瓦特科技股份公司 It is applied to the method and system of flight avoidance
CN106842177B (en) * 2016-12-30 2021-03-09 易瓦特科技股份公司 Method and device for acquiring standard obstacle avoidance distance
CN106774407A (en) * 2016-12-30 2017-05-31 易瓦特科技股份公司 Barrier-avoiding method and device
CN106871906B (en) * 2017-03-03 2020-08-28 西南大学 Navigation method and device for blind person and terminal equipment
WO2018201358A1 (en) * 2017-05-03 2018-11-08 深圳市元征科技股份有限公司 Intelligent survey device control method and device and storage medium
JP6795457B2 (en) * 2017-06-02 2020-12-02 本田技研工業株式会社 Vehicle control systems, vehicle control methods, and vehicle control programs
WO2018227350A1 (en) * 2017-06-12 2018-12-20 深圳市大疆创新科技有限公司 Control method for homeward voyage of unmanned aerial vehicle, unmanned aerial vehicle and machine-readable storage medium
CN109213187A (en) * 2017-06-30 2019-01-15 北京臻迪科技股份有限公司 A kind of displacement of unmanned plane determines method, apparatus and unmanned plane
CN108184096B (en) * 2018-01-08 2020-09-11 北京艾恩斯网络科技有限公司 Panoramic monitoring device, system and method for airport running and sliding area
WO2019144286A1 (en) * 2018-01-23 2019-08-01 深圳市大疆创新科技有限公司 Obstacle detection method, mobile platform, and computer readable storage medium
KR102061514B1 (en) * 2018-03-02 2020-01-02 주식회사 만도 Apparatus and method for detecting objects
CN108820215B (en) * 2018-05-21 2021-10-01 南昌航空大学 Automatic air-drop unmanned aerial vehicle capable of automatically searching target
CN109031197B (en) * 2018-06-19 2019-07-09 哈尔滨工业大学 A kind of direct localization method of radiation source based on over-the-horizon propagation model
CN112567201B (en) * 2018-08-21 2024-04-16 深圳市大疆创新科技有限公司 Distance measuring method and device
CN109202903B (en) * 2018-09-13 2021-05-28 河南机电职业学院 Method for calibrating countsPerMeter parameter and base standard system of sorting robot workstation conveying chain
CN108873001B (en) * 2018-09-17 2022-09-09 江苏金智科技股份有限公司 Method for accurately judging positioning accuracy of robot
US20200159252A1 (en) * 2018-11-21 2020-05-21 Eagle View Technologies, Inc. Navigating unmanned aircraft using pitch
CN109435886B (en) * 2018-12-05 2021-02-23 广西农业职业技术学院 Auxiliary detection method for safe driving of automobile
CN109866230A (en) * 2019-01-17 2019-06-11 深圳壹账通智能科技有限公司 Customer service robot control method, device, computer equipment and storage medium
CN113474819A (en) * 2019-03-27 2021-10-01 索尼集团公司 Information processing apparatus, information processing method, and program
WO2021195886A1 (en) * 2020-03-30 2021-10-07 深圳市大疆创新科技有限公司 Distance determination method, mobile platform, and computer-readable storage medium
US11946771B2 (en) 2020-04-01 2024-04-02 Industrial Technology Research Institute Aerial vehicle and orientation detection method using same
WO2021237535A1 (en) * 2020-05-27 2021-12-02 深圳市大疆创新科技有限公司 Collision processing method and device, and medium

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101968354A (en) * 2010-09-29 2011-02-09 清华大学 Laser detection and image identification based unmanned helicopter distance measuring method
EP2511781A1 (en) * 2011-04-14 2012-10-17 Hexagon Technology Center GmbH Method and system for controlling an unmanned aircraft
FR2985329B1 (en) * 2012-01-04 2015-01-30 Parrot METHOD FOR INTUITIVE CONTROL OF A DRONE USING A REMOTE CONTROL APPARATUS
US9497380B1 (en) * 2013-02-15 2016-11-15 Red.Com, Inc. Dense field imaging
CN106546257B (en) * 2013-04-16 2019-09-13 合肥杰发科技有限公司 Vehicle distance measurement method and device, vehicle relative velocity measurement method and device
WO2015157883A1 (en) * 2014-04-17 2015-10-22 SZ DJI Technology Co., Ltd. Flight control for flight-restricted regions
CN104656665B (en) * 2015-03-06 2017-07-28 云南电网有限责任公司电力科学研究院 A kind of general avoidance module of new unmanned plane and step
CN104730533A (en) * 2015-03-13 2015-06-24 陈蔼珊 Mobile terminal, and ranging method and system based on mobile terminal
WO2016205415A1 (en) * 2015-06-15 2016-12-22 ImageKeeper LLC Unmanned aerial vehicle management
WO2017008224A1 (en) * 2015-07-13 2017-01-19 深圳市大疆创新科技有限公司 Moving object distance detection method, device and aircraft
WO2017017675A1 (en) * 2015-07-28 2017-02-02 Margolin Joshua Multi-rotor uav flight control method and system
US10017237B2 (en) * 2015-12-29 2018-07-10 Qualcomm Incorporated Unmanned aerial vehicle structures and methods
KR20170138797A (en) * 2016-06-08 2017-12-18 엘지전자 주식회사 Drone
JP2018146546A (en) * 2017-03-09 2018-09-20 エアロセンス株式会社 Information processing system, information processing device, and information processing method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220391610A1 (en) * 2021-06-07 2022-12-08 Goodrich Corporation Land use for target prioritization
US11810346B2 (en) * 2021-06-07 2023-11-07 Goodrich Corporation Land use for target prioritization

Also Published As

Publication number Publication date
CN106104203A (en) 2016-11-09
US10591292B2 (en) 2020-03-17
WO2017008224A1 (en) 2017-01-19
US20180156610A1 (en) 2018-06-07
CN106104203B (en) 2018-02-02


Legal Events

Date Code Title Description
AS Assignment

Owner name: SZ DJI TECHNOLOGY CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZHU, ZHENYU;CHEN, ZIHAN;ZHANG, ZHIYUAN;AND OTHERS;REEL/FRAME:052101/0207

Effective date: 20180112

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STCB Information on status: application discontinuation

Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION