WO2017080108A1 - Flight device, flight control system and method - Google Patents

Flight device, flight control system and method

Info

Publication number
WO2017080108A1
WO2017080108A1 (PCT/CN2016/071016)
Authority
WO
WIPO (PCT)
Prior art keywords
image
flying device
height
offset
scene
Application number
PCT/CN2016/071016
Other languages
English (en)
French (fr)
Inventor
李佐广
Original Assignee
深圳市道通智能航空技术有限公司
Application filed by 深圳市道通智能航空技术有限公司 filed Critical 深圳市道通智能航空技术有限公司
Publication of WO2017080108A1 publication Critical patent/WO2017080108A1/zh
Priority to US15/625,225 priority Critical patent/US10234873B2/en

Classifications

    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • B64D 47/00 Equipment not otherwise provided for
    • B64D 47/08 Arrangements of cameras
    • B64U 20/00 Constructional aspects of UAVs
    • B64U 20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U 20/83 Electronic components structurally integrated with aircraft elements, e.g. circuit boards carrying loads
    • B64U 20/87 Mounting of imaging devices, e.g. mounting of gimbals
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/80 Geometric correction
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G06T 7/194 Segmentation; Edge detection involving foreground-background segmentation
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/593 Depth or shape recovery from stereo images
    • B64C 39/00 Aircraft not otherwise provided for
    • B64C 39/02 Aircraft not otherwise provided for characterised by special use
    • B64C 39/024 Aircraft not otherwise provided for, of the remote controlled vehicle type, i.e. RPV
    • B64U 2201/00 UAVs characterised by their flight controls
    • B64U 2201/10 UAVs with autonomous flight controls, i.e. navigating independently from ground or air stations, e.g. using inertial navigation systems [INS]
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30181 Earth observation
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle

Definitions

  • The present invention relates to a method, and more particularly to a flight control method, a flight control system, and a flying device for controlling a flying device.
  • In the control of drones, speed detection and/or positioning control of the drone is a key technology.
  • At present, speed detection and/or positioning control of drones mostly relies on GPS (global positioning system); when the drone is in an area where the GPS signal is weak or not covered, speed detection and/or positioning control cannot be performed.
  • In addition, current speed detection and/or positioning control of UAVs is mostly performed with an algorithm based on a single generic scene; when the scene the drone is actually in differs greatly from that generic scene, the generic-scene algorithm often causes inaccurate positioning.
  • In view of this, the present invention provides a flying device and a flight control method that can perform speed detection and positioning control of a flying device without relying on GPS.
  • To solve the above technical problem, the present invention provides the following technical solutions.
  • In one aspect, the present invention provides a flight control system for controlling a flying device, the flight control system comprising: an acquisition module for acquiring images captured by a binocular camera module of the flying device; a scene determination module for determining the scene in which the flying device is currently located; a height determination module for determining the height of the flying device according to the depth-of-field information of the images captured by the binocular camera module; an image offset determination module for calculating, according to two adjacent frames captured by the binocular camera module and the scene in which the flying device is currently located, the image X offset and image Y offset of the second frame relative to the first; an offset calibration module for acquiring the acceleration and angular velocity of the flying device in three dimensions as detected by the acceleration sensor of the flying device, and compensating the image X, Y offsets according to that acceleration and angular velocity to obtain an image correction offset comprising the corrected image X offset and image Y offset; and a speed calculation module for calculating, from the lens focal length of the binocular camera module, the height of the flying device, and the image correction offset, the X, Y offsets in world coordinates corresponding to the image correction offset, and determining the speed of the flying device from the capture interval of the two adjacent frames and the world-coordinate X, Y offsets.
  • In some embodiments, determining the height of the flying device from the depth information of the images captured by the binocular camera module includes: the height determination module calibrates the two cameras of the binocular camera module to obtain the intrinsic parameters of the camera module, including focal length, image center and distortion coefficients, and the extrinsic parameters, including the rotation matrix and translation matrix; the module then rectifies the binocular pair, performs stereo matching to obtain a disparity map, and performs three-dimensional reconstruction to obtain depth information; the depth map is converted to grayscale and normalized to the range [0, 255] to obtain the height.
  • In some embodiments, the flying device further includes a distance sensor for detecting the distance between the flying device and the ground, and the height determination module is configured to decide, according to the scene in which the flying device is currently located, whether to derive the height of the flying device from the depth-of-field information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device.
  • In some embodiments, the scene determination module determines whether the scene in which the flying device is located is a texture-rich scene or a texture-poor scene according to at least one parameter of the images captured by the camera module, the parameters including but not limited to texture.
  • In some embodiments, the height determination module chooses, according to whether the scene is texture-rich and according to the maximum area of regions with the same texture, whether to determine the height of the flying device from the depth-of-field information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device.
  • In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is greater than a minimum value SMIN and less than a maximum value SMAX, the module determines the height of the flying device from the depth-of-field information of the images captured by the cameras of the camera module, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is set to 3/4 of the maximum image area.
  • In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is greater than the maximum value SMAX, and the difference between the height derived for that region from the depth-of-field information of the captured images and the height measured by the distance sensor exceeds a threshold, the module determines the height of the flying device from the depth-of-field information calculated from the images captured by the camera module, where the maximum value SMAX is 3/4 of the maximum image area.
  • In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is smaller than the minimum value SMIN or greater than the maximum value SMAX, the module either determines the height from the depth information of the images captured by the camera module or uses the distance measured by the distance sensor as the height of the flying device, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is 3/4 of the maximum image area.
  • In some embodiments, in a scene without rich texture, the height determination module uses the distance measured by the distance sensor as the height of the flying device.
  • In another aspect, the present invention provides a flight control method for controlling a flying device, the flight control method comprising: acquiring images captured by a binocular camera module of the flying device; determining the scene in which the flying device is currently located; determining the height of the flying device according to the depth-of-field information of the images captured by the binocular camera module; calculating, according to two adjacent frames captured by the binocular camera module and the scene in which the flying device is currently located, the image X offset and image Y offset of the second frame relative to the first; acquiring the acceleration and angular velocity of the flying device in three dimensions as detected by the acceleration sensor of the flying device, and compensating the image X offset and image Y offset according to that acceleration and angular velocity to obtain an image correction offset comprising the corrected image X offset and image Y offset; and calculating, from the lens focal length of the camera module, the height of the flying device, and the image correction offset, the X, Y offsets in world coordinates corresponding to the image correction offset, and determining the speed of the flying device from the capture interval of the two adjacent frames and the world-coordinate X, Y offsets.
  • In some embodiments, the step of "determining the height of the flying device according to the depth-of-field information of the images captured by the binocular camera module" includes: calibrating the two cameras of the binocular camera module to obtain the intrinsic parameters of the camera module, including focal length, image center and distortion coefficients, and the extrinsic parameters, including the rotation matrix and translation matrix; rectifying the binocular pair; performing stereo matching to obtain a disparity map; and performing three-dimensional reconstruction to obtain depth information, converting the depth map to grayscale and normalizing it to the range [0, 255] to obtain the height.
  • In some embodiments, the method further comprises: deciding, according to the scene in which the flying device is currently located, whether to derive the height of the flying device from the depth-of-field information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device.
  • In some embodiments, the step of "determining the scene in which the flying device is currently located" includes determining whether the scene in which the flying device is located is texture-rich or texture-poor according to at least one parameter of the images captured by the camera module, the parameters including but not limited to texture.
  • In some embodiments, the step of deciding the height source includes: in a texture-rich scene, if the maximum region area AM of the image is greater than the minimum value SMIN and less than the maximum value SMAX, determining the height of the flying device from the depth-of-field information of the images captured by the cameras of the camera module, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is set to 3/4 of the maximum image area.
  • In some embodiments, the step of deciding the height source includes: in a texture-rich scene, if the maximum area AM of regions with similar depth values in the depth image is greater than the maximum value SMAX, and the height derived for that region from the depth-of-field information of the captured images differs from the height measured by the distance sensor by more than a threshold, determining the height of the flying device from the depth-of-field information calculated from the images captured by the camera module, where the maximum value SMAX is 3/4 of the maximum image area.
  • In some embodiments, the step of deciding the height source includes: in a texture-rich scene, if the maximum area AM of regions with similar depth values in the depth image is smaller than the minimum value SMIN or greater than the maximum value SMAX, either determining the height of the flying device from the depth information of the captured images or using the distance measured by the distance sensor as the height, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is 3/4 of the maximum image area.
  • In some embodiments, the step of deciding the height source includes: in a scene without rich texture, using the distance measured by the distance sensor as the height of the flying device.
  • In another aspect, a flying device is provided, including a binocular camera module, a distance sensor, an acceleration sensor, and a flight control system. The binocular camera module includes two cameras for capturing images respectively; the distance sensor is used to obtain the height of the flying device; the acceleration sensor is used to detect the acceleration and angular velocity of the flying device in three dimensions. The flight control system comprises: an acquisition module for acquiring images captured by the binocular camera module of the flying device; a scene determination module for determining the scene in which the flying device is currently located; a height determination module for determining the height of the flying device according to the depth information of the images captured by the binocular camera module; an image offset determination module for calculating, according to two adjacent frames captured by the binocular camera module and the scene in which the flying device is currently located, the image X offset and image Y offset of the second frame relative to the first; an offset calibration module for acquiring the acceleration and angular velocity of the flying device in three dimensions as detected by the acceleration sensor, and compensating the image X, Y offsets accordingly to obtain an image correction offset comprising the corrected image X and Y offsets; and a speed calculation module for calculating, from the lens focal length of the binocular camera module, the height of the flying device, and the image correction offset, the X, Y offsets in world coordinates corresponding to the image correction offset, and determining the speed of the flying device from the capture interval of the two adjacent frames and the world-coordinate X, Y offsets.
  • In some embodiments, determining the height of the flying device from the depth information of the images captured by the binocular camera module includes: the height determination module calibrates the two cameras of the binocular camera module to obtain the intrinsic parameters of the camera module, including focal length, image center and distortion coefficients, and the extrinsic parameters, including the rotation matrix and translation matrix; the module then rectifies the binocular pair, performs stereo matching to obtain a disparity map, and performs three-dimensional reconstruction to obtain depth information; the depth map is converted to grayscale and normalized to the range [0, 255] to obtain the height.
  • In some embodiments, the flying device further includes a distance sensor for detecting the distance between the flying device and the ground, and the height determination module is configured to decide, according to the scene in which the flying device is currently located, whether to derive the height of the flying device from the depth-of-field information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device.
  • In some embodiments, the scene determination module determines whether the scene in which the flying device is located is a texture-rich scene or a texture-poor scene according to at least one parameter of the images captured by the camera module, the parameters including but not limited to texture.
  • In some embodiments, the height determination module chooses, according to whether the scene is texture-rich and according to the maximum area of regions with the same texture, whether to determine the height of the flying device from the depth information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device.
  • In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is greater than the minimum value SMIN and less than the maximum value SMAX, the module determines the height of the flying device from the depth-of-field information of the images captured by the cameras of the camera module, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is set to 3/4 of the maximum image area.
  • In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is greater than the maximum value SMAX, and the difference between the height derived for that region from the depth-of-field information of the captured images and the height measured by the distance sensor exceeds a threshold, the module determines the height of the flying device from the depth-of-field information calculated from the images captured by the camera module, where the maximum value SMAX is 3/4 of the maximum image area.
  • In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is smaller than the minimum value SMIN or greater than the maximum value SMAX, the module either determines the height from the depth-of-field information of the images captured by the camera module or uses the distance measured by the distance sensor as the height of the flying device, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is 3/4 of the maximum image area.
  • In some embodiments, in a scene without rich texture, the height determination module uses the distance measured by the distance sensor as the height of the flying device.
  • The beneficial effect of the invention is that speed detection and positioning control can still be performed when the GPS signal is weak or absent, and precise control can be performed based on different scenes.
  • FIG. 1 is a schematic diagram of a hardware architecture of a flying device according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of a flight control system in accordance with an embodiment of the present invention.
  • FIG. 3 is an explanatory diagram of the X and Y offsets in world coordinates in an embodiment of the present invention.
  • FIG. 4 is a schematic diagram showing the relationship between the X and Y offsets of the world coordinates and the image correction offset amount according to an embodiment of the present invention.
  • FIG. 5 is a flowchart of a flight control method according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram of a hardware architecture of a flying device 100 in an embodiment.
  • the flying device 100 includes a processor 10, a camera module 20, and an acceleration sensor 30.
  • the camera module 20 is configured to capture images at predetermined intervals, for example one image every second.
  • In this embodiment, the camera module 20 is a binocular camera module including two cameras 21 and 22, through each of which the camera module 20 captures an image.
  • the acceleration sensor 30 is used to detect the acceleration and angular velocity of the flying device 100 in the three-dimensional direction (X, Y, Z).
  • In an embodiment, the acceleration sensor 30 can be a gyroscope, and the distance sensor 40 can be an ultrasonic sensor.
  • the processor 10 operates a flight control system S1.
  • the flight control system S1 includes an acquisition module 11, a scene determination module 12, a height determination module 13, an image offset determination module 14, an offset calibration module 15, a speed calculation module 16, and an operation control module 17.
  • the flight control system S1 is used to detect the speed of the flying device 100 and perform positioning control.
  • The modules of the flight control system S1 can be programmed instruction modules invoked and executed by the processor 10; they can also be firmware embedded in the processor 10.
  • In one application, the flight control system S1 can be application software installed in the flying device 100.
  • The acquisition module 11 is configured to acquire the images captured by the camera module 20.
  • In this embodiment, the acquisition module 11 acquires the images captured by the camera module 20 in real time.
  • The scene determination module 12 is configured to determine the scene in which the flying device 100 is currently located. Specifically, the scene determination module 12 determines the scene in which the flying device 100 is located based on at least one parameter feature in the image.
  • The height determination module 13 is configured to calculate depth-of-field information from the images captured by the cameras 21 and 22 of the camera module 20, as obtained by the acquisition module 11, and to determine the height of the flying device 100 from that depth-of-field information.
  • The image offset determination module 14 is configured to calculate, from two adjacent frames captured by the camera module 20 as obtained by the acquisition module 11 and the scene currently determined by the scene determination module 12, the image X offset and image Y offset of the second of the two adjacent frames relative to the first.
  • The two adjacent frames acquired by the image offset determination module 14 are frames captured consecutively by either camera of the camera module 20.
  • In one embodiment, the image offset determination module 14 analyzes the change of parameters between the two frames with the algorithm corresponding to the scene in which the flying device 100 is located, and thereby calculates the image X, Y offsets of the second frame relative to the first.
  • In another embodiment, the image offset determination module 14 may determine the same feature point in the two adjacent frames and calculate that feature point's X and Y offsets between the two frames to obtain the image X and Y offsets. The same feature point is the imaging point of the same object in the two adjacent frames, and the image X and Y offsets are the offsets of that imaging point in the X and Y directions between the two frames.
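  • A minimal sketch of this feature-point variant, assuming OpenCV is available; the ORB detector and the median aggregation are illustrative choices, not specified by the disclosure:

    import cv2
    import numpy as np

    def image_offset_from_features(prev_gray, curr_gray):
        """Estimate the (x, y) pixel offset of curr_gray relative to prev_gray
        by matching the same feature points in both frames."""
        orb = cv2.ORB_create(500)
        kp1, des1 = orb.detectAndCompute(prev_gray, None)
        kp2, des2 = orb.detectAndCompute(curr_gray, None)
        if des1 is None or des2 is None:
            return 0.0, 0.0
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des1, des2)
        if not matches:
            return 0.0, 0.0
        # Displacement of each matched imaging point between the two frames.
        d = np.array([np.array(kp2[m.trainIdx].pt) - np.array(kp1[m.queryIdx].pt)
                      for m in matches])
        # The median is a robust estimate of the global image X, Y offset.
        dx, dy = np.median(d, axis=0)
        return float(dx), float(dy)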
  • The offset calibration module 15 is configured to acquire the acceleration and angular velocity of the flying device 100 in three dimensions as detected by the acceleration sensor 30, and to compensate the image X and Y offsets according to that acceleration and angular velocity to obtain an image correction offset.
  • The speed calculation module 16 is configured to calculate, from the lens focal length, the height of the flying device 100, and the image correction offset, the X, Y offsets in world coordinates corresponding to the image correction offset, that is, the actual X and Y offsets in the real world.
  • In the present invention, X and Y refer respectively to the horizontal-axis and vertical-axis directions on a plane parallel to the ground in the three-dimensional coordinate system.
  • The X and Y offsets in world coordinates are the moving distances of the flying device 100 / camera module 20 relative to the ground in the X and Y directions.
  • Each of the cameras 21, 22 of the camera module 20 includes a lens 201 and an image sensor 202.
  • Because the flying device 100 moves during the time interval in which the two adjacent images P1, P2 are captured, the camera module 20 moves relative to an actual object A. When the flying device 100 moves to the upper right during that interval, the imaging point A1 of the actual object A, formed on the image sensor 202 through the lens 201 of the flying device 100, shifts toward the lower left between the two adjacent images P1, P2, as shown in FIG. 3.
  • This shift forms the image X offset and image Y offset, which after compensation become the image correction offset. The image correction offset therefore has a definite correspondence with the X and Y offsets in world coordinates, and the world-coordinate X, Y offsets can be obtained from the image correction offset and related quantities.
  • The camera module 20 can be a still camera, a video camera, a camera head, or the like.
  • The image correction offset is the actual distance on the image sensor 202, in the X and Y directions, corresponding to the displacement of the imaging point of object A between the two adjacent images P1, P2.
  • Please refer to FIG. 4, a diagram of the relationship between the world-coordinate X, Y offsets and the image correction offset. Let the lens focal length be f, the height of the flying device 100 be H, the corrected image X offset in the image correction offset be x1, the corrected image Y offset be y1, the world-coordinate X offset be X1, and the world-coordinate Y offset be Y1. When the camera module 20 shoots downward, the object distance equals the height H. As FIG. 4 shows, the corrected image offsets and the world-coordinate offsets satisfy formula 1: x1/X1 = f/H and formula 2: y1/Y1 = f/H; in other words, the image-side offset relates to the world-side offset by the ratio of focal length to height. Since f, H, x1 and y1 are all known, the speed calculation module 16 can solve formulas 1 and 2 for the world-coordinate offsets X1 and Y1.
  • The speed calculation module 16 then determines the speed of the flying device from the time interval t1 between the two adjacent image captures and the X and Y offsets in world coordinates.
  • As noted above, the world-coordinate X, Y offsets are the distances the flying device moves in the X and Y directions during that interval. With the camera module 20 capturing the two adjacent frames at interval t1, the speed calculation module 16 computes the speed of the flying device as X1/t1 in the X direction and Y1/t1 in the Y direction.
  • In one embodiment, the speed calculation module 16 first takes the vector sum of the world-coordinate X, Y offsets to obtain the actual displacement D1 of the flying device 100, and then computes the actual speed of the flying device 100 as D1/t1.
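  • A worked sketch of formulas 1 and 2 and the speed computation; variable names follow the description, and the image offsets x1, y1 are assumed to be expressed in the same length units as the focal length f:

    def world_offsets(x1, y1, f, H):
        """Formulas 1 and 2: x1/X1 = f/H and y1/Y1 = f/H,
        so the world-coordinate offsets are the image offsets scaled by H/f."""
        X1 = x1 * H / f
        Y1 = y1 * H / f
        return X1, Y1

    def speed(x1, y1, f, H, t1):
        """Per-axis speed X1/t1, Y1/t1 and total speed D1/t1 over interval t1."""
        X1, Y1 = world_offsets(x1, y1, f, H)
        D1 = (X1 ** 2 + Y1 ** 2) ** 0.5   # actual displacement (vector sum)
        return X1 / t1, Y1 / t1, D1 / t1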
  • The operation control module 17 is for positioning and/or hovering control of the flying device 100 based at least on the speed of the flying device 100. For example, the operation control module 17 calculates the time required based on the speed of the flying device 100 and the distance between the flying device 100 and the destination, and prepares for hovering or landing when the required time is less than a predetermined value. In an embodiment, when the operation control module 17 determines that the currently calculated speed is essentially equal in magnitude but opposite in direction to the speed calculated at the previous instant, it concludes that the flight speed of the flying device 100 is close to 0 and the movement distance is very small, e.g. on the order of 1 cm, and then controls the flying device 100 to hover at its current position.
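  • A minimal sketch of this hover test; the tolerance values are assumptions, since the disclosure only gives the 1 cm order of magnitude:

    import math

    def should_hover(vx, vy, prev_vx, prev_vy, t1,
                     eps_speed=0.01, eps_dist=0.01):
        """Hover when the current velocity is roughly equal in magnitude and
        opposite in direction to the previous one, and the net displacement
        over the interval is tiny (about 1 cm)."""
        opposite = (math.isclose(vx, -prev_vx, abs_tol=eps_speed) and
                    math.isclose(vy, -prev_vy, abs_tol=eps_speed))
        net_dist = math.hypot((vx + prev_vx) * t1, (vy + prev_vy) * t1)
        return opposite and net_dist < eps_dist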
  • Therefore, even without a GPS signal, the speed of the flying device 100 can be calculated from the captured images and positioning control can be performed.
  • the flying device 100 is an unmanned aerial vehicle.
  • In this embodiment, the scene determination module 12 automatically determines the scene in which the flying device 100 is currently located from the images captured by the camera module 20. In other embodiments, the scene determination module 12 may instead take a scene selected by the user as the scene in which the flying device 100 is currently located.
  • In an embodiment, the flying device 100 further includes a distance sensor 40 for detecting the distance between the flying device 100 and the ground, that is, for detecting the height of the flying device 100.
  • The height determination module 13 is configured to decide, according to the scene in which the flying device 100 is currently located, whether to derive the height of the flying device 100 from the depth-of-field information calculated from the images captured by the camera module 20 or to use the distance detected by the distance sensor 40 as the height of the flying device 100.
  • The scene determination module 12 determines the type of the scene according to at least one parameter feature in the images captured by the camera module 20.
  • The at least one parameter includes texture. The scene determination module 12 processes the image with a Sobel gradient algorithm to obtain a gradient matrix, counts the number C of pixels in the gradient matrix whose gradient exceeds a threshold T1, and considers the texture rich when C is greater than a threshold T2; otherwise the texture is considered sparse.
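  • A minimal sketch of this texture test using OpenCV's Sobel operator; the threshold values T1 and T2 are placeholders, as the disclosure does not fix them:

    import cv2
    import numpy as np

    def is_texture_rich(gray, T1=50.0, T2=5000):
        """Sobel gradient magnitude, then count pixels above T1;
        the scene is texture-rich when that count C exceeds T2."""
        fx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
        fy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
        grad = cv2.magnitude(fx, fy)
        C = int(np.count_nonzero(grad > T1))
        return C > T2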
  • The at least one parameter further includes reflections. The scene determination module 12 performs shadow statistics over consecutive frames to determine whether a reflection is present, thereby classifying the scene as reflective or non-reflective.
  • Specifically, the scene determination module 12 checks whether the consecutive frames contain alternating dark and bright regions, or performs drone-lamp reflection detection: detection is carried out according to the shape of the drone's lamps, and if a matching lamp shape is found, the scene is determined to be a reflective scene. Specifically, the scene determination module judges according to a reflection detection algorithm.
  • In general, on a reflective floor the drone's circular lamps appear in the image as bright regions of the grayscale image. The scene determination module 12 therefore tests whether each pixel of the grayscale image is greater than or equal to a threshold T, where T is set empirically and may be set to 220. The module sets pixels with gray value greater than or equal to T to 255 and pixels below T to 0, converting the image into a binary image in which 0 is the background and 255 is the foreground; it then extracts connected regions and represents each by its bounding rectangle. The scene determination module 12 then checks the target size: regions within the expected size range are considered lamp-reflection targets, where the size range is derived from measurements of lamp reflections at different heights.
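  • A minimal sketch of the lamp-reflection test with OpenCV connected components; the size bounds stand in for the height-dependent size range and are assumptions:

    import cv2

    def has_lamp_reflection(gray, T=220, min_area=30, max_area=2000):
        """Threshold at T (empirically 220), extract connected regions,
        and accept any whose bounding-box area is in the expected range."""
        # thresh=T-1 with THRESH_BINARY maps gray values >= T to 255.
        _, binary = cv2.threshold(gray, T - 1, 255, cv2.THRESH_BINARY)
        n, _, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
        for i in range(1, n):  # label 0 is the background
            w, h = stats[i, cv2.CC_STAT_WIDTH], stats[i, cv2.CC_STAT_HEIGHT]
            if min_area <= w * h <= max_area:
                return True
        return False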
  • The at least one parameter further includes grayscale. The scene determination module 12 converts the image into a grayscale histogram according to its gray values, gathers statistics, and compares them with corresponding thresholds to detect whether the scene is dark, of normal brightness, or bright.
  • Specifically, the scene determination module 12 takes the average brightness L of the grayscale histogram: when L < 80 the scene is dark, when L > 170 the scene is bright, and when 80 < L < 170 the scene is of normal brightness.
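  • A compact sketch of this brightness classification, taking the mean gray level as the histogram average L; the 80/170 thresholds follow the description:

    import numpy as np

    def brightness_class(gray):
        """Classify scene brightness from the average gray level L."""
        L = float(np.mean(gray))  # equivalent to the histogram-weighted mean
        if L < 80:
            return "dark"
        if L > 170:
            return "bright"
        return "normal"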
  • The at least one parameter may further include lines. The scene determination module 12 performs gradient detection on the image, binarizes it, and then applies conventional Hough line detection to determine whether straight lines are present; if at least one line is detected whose length spans at least 1/2 of the image width, the scene is judged to be line-rich.
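  • A minimal sketch of the line-rich test using Canny edges and the probabilistic Hough transform; the edge thresholds are assumptions:

    import cv2
    import numpy as np

    def is_line_rich(gray):
        """Edge-detect, then look for at least one straight line spanning
        at least half of the image width."""
        edges = cv2.Canny(gray, 50, 150)
        min_len = gray.shape[1] / 2  # half the image width
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 80,
                                minLineLength=int(min_len), maxLineGap=5)
        return lines is not None and len(lines) >= 1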
  • In one embodiment, the scene determination module 12 determines the type of the current scene from a single parameter and the corresponding algorithm. In other embodiments, the scene determination module may determine the scene from several parameters and the corresponding algorithms simultaneously. For example, the scene determination module 12 can judge from texture and lines together whether the current scene is both texture-rich and line-rich.
  • The height determination module 13 chooses, according to whether the scene is texture-rich and according to the maximum area of regions with the same texture, either to compute depth-of-field information from the images captured by the cameras 21 and 22 of the camera module 20 and use it as the height of the flying device 100, or to use the distance detected by the distance sensor 40 as the height of the flying device 100.
  • Specifically, this selection according to the scene in which the flying device 100 is located comprises the following steps:
  • In a texture-rich scene, if the maximum area AM of regions with similar depth values in the depth image is greater than SMIN and less than SMAX, the height determination module 13 computes depth-of-field information from the images captured by the cameras 21 and 22 of the camera module 20 and uses it as the height of the flying device 100. SMIN can be set to 1/4 of the maximum image area (image width × image height), and SMAX to 3/4 of the maximum image area.
  • In a texture-rich scene, if the maximum region area AM is greater than SMAX, and the height derived for that region from the depth-of-field information differs from the height measured by the distance sensor 40 by more than a threshold (for example, 10 cm), the ultrasonic measurement is considered inaccurate for this scene, and the height determination module 13 uses the depth-of-field information calculated from the images captured by the cameras 21 and 22 of the camera module 20 as the height of the flying device 100.
  • For texture-free images, such as floor tiles with very little texture, stereo matching cannot produce accurate results, so the derived depth information and height are unreliable, whereas ultrasound gives a fairly accurate height measurement in such scenes. Therefore, in a texture-poor scene, the height determination module 13 uses the height measured by the distance sensor 40.
  • Likewise, in a texture-rich scene, if the maximum region area AM is smaller than SMIN or greater than SMAX, the height determination module 13 also uses the height measured by the distance sensor 40.
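  • A sketch of the overall height-source selection; the area fractions and the 10 cm threshold follow the description, while AM, h_stereo and h_ultra are assumed to be computed elsewhere:

    def select_height(texture_rich, AM, img_area, h_stereo, h_ultra,
                      diff_thresh=0.10):
        """Choose stereo depth-of-field height vs. ultrasonic height.
        AM: maximum area of the similar-depth region in the depth image."""
        if not texture_rich:
            return h_ultra                      # texture-poor: trust ultrasound
        SMIN, SMAX = img_area / 4, 3 * img_area / 4
        if SMIN < AM < SMAX:
            return h_stereo                     # large consistent region: stereo
        if AM > SMAX and abs(h_stereo - h_ultra) > diff_thresh:
            return h_stereo                     # ultrasound deemed inaccurate
        return h_ultra                          # otherwise fall back to ultrasound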
  • The height determination module 13 computes the depth-of-field information from the images captured by the cameras 21 and 22 of the camera module 20 as follows: the two cameras 21 and 22 are calibrated, the purpose of calibration being to obtain the camera intrinsics (focal length, image center, distortion coefficients, etc.) and the extrinsics (the rotation matrix R and translation matrix T). The specific steps are: 1) calibrate the left camera 21 to obtain intrinsic and extrinsic parameters; 2) calibrate the right camera 22 to obtain its extrinsics; 3) perform binocular calibration to obtain the translation-rotation relationship between the cameras. The binocular pair is then rectified, for example to correct distortion; stereo matching is then performed to obtain a disparity map; finally, 3D (three-dimensional) reconstruction yields depth information, and the depth map is converted to grayscale and normalized to the range [0, 255] to obtain the height H.
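  • A condensed sketch of this pipeline with OpenCV, assuming the calibration outputs (intrinsics M1, M2, distortions d1, d2, extrinsics R, T) are already available; the block-matcher settings are illustrative, and invalid disparities should be masked in practice:

    import cv2
    import numpy as np

    def depth_map(left_gray, right_gray, M1, d1, M2, d2, R, T):
        """Rectify the calibrated pair, stereo-match to a disparity map,
        and return a depth map normalized to [0, 255]."""
        size = left_gray.shape[::-1]
        R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(M1, d1, M2, d2, size, R, T)
        map1 = cv2.initUndistortRectifyMap(M1, d1, R1, P1, size, cv2.CV_16SC2)
        map2 = cv2.initUndistortRectifyMap(M2, d2, R2, P2, size, cv2.CV_16SC2)
        l = cv2.remap(left_gray, *map1, cv2.INTER_LINEAR)
        r = cv2.remap(right_gray, *map2, cv2.INTER_LINEAR)
        # SGBM returns fixed-point disparity scaled by 16.
        disp = cv2.StereoSGBM_create(0, 64, 9).compute(l, r).astype(np.float32) / 16
        depth = cv2.reprojectImageTo3D(disp, Q)[:, :, 2]   # Z channel
        return cv2.normalize(depth, None, 0, 255, cv2.NORM_MINMAX, cv2.CV_8U)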
  • The image offset determination module 14 calculates the image offset of the second of the two adjacent frames relative to the first according to the two adjacent frames and the scene in which the flying device 100 is currently located: the module analyzes the change of parameters between the two frames with the algorithm corresponding to that scene, and thereby computes the image X, Y offsets of the second frame relative to the first.
  • Specifically, for a texture-rich scene, the image offset determination module 14 employs a grayscale template matching algorithm. Let the current image width and height be W and H. A template image T of size Mx × My, with Mx = W - 8 and My = H - 8, is taken from the current frame at position [4, 4]; the matching image S, of size Nx × Ny with Nx = W and Ny = H, is taken from the previous frame.
  • During matching, the template is slid over the matching image. The search sub-image covered by the template is S(i, j), where i, j is the position of the sub-image's top-left pixel in the matching image S, with i and j taking values in the range [-4, 4], and S(0, 0) corresponding to position [4, 4] of the matching image.
  • Matching computes the correlation function SAD to find the search sub-image most similar to the template and its coordinates i and j. The (i, j) minimizing the SAD of T and S(i, j) is the best matching position, that is, the relative offset or movement of the two adjacent frames in the X and Y directions, which is the image X, Y offset of the two adjacent frames, with the offset lying in the range [-4, 4].
  • SAD refers to accumulating the absolute values of the pixel differences at each corresponding position of the two images. The smaller the SAD value, the better the two images match, so the minimum can be taken as the best match.
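  • A direct sketch of this search in NumPy; the template margin and the ±4 search radius follow the description:

    import numpy as np

    def sad_offset(prev_gray, curr_gray, R=4):
        """Slide a (H-2R) x (W-2R) template from the current frame over the
        previous frame; the (i, j) minimizing SAD is the image X, Y offset."""
        H, W = curr_gray.shape
        T = curr_gray[R:H - R, R:W - R].astype(np.int32)
        best, best_ij = None, (0, 0)
        for i in range(-R, R + 1):          # vertical shift
            for j in range(-R, R + 1):      # horizontal shift
                S = prev_gray[R + i:H - R + i, R + j:W - R + j].astype(np.int32)
                sad = np.abs(T - S).sum()
                if best is None or sad < best:
                    best, best_ij = sad, (j, i)   # returned in (x, y) order
        return best_ij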
  • For other scenes, the image offset determination module 14 employs a Sobel gradient template matching algorithm. Specifically, the module performs edge detection with the Sobel operator, which computes with a two-dimensional template comprising a horizontal template and a vertical template (the standard horizontal and vertical Sobel kernels), used for the horizontal and vertical differential operations respectively.
  • The image offset determination module 14 convolves the image with these templates in the plane, computes the horizontal convolution fx and the vertical convolution fy, and takes the gradient value G as the square root of fx^2 + fy^2. Gradient operations on the two adjacent images then yield gradient value matrices A and B, where A is the Sobel gradient matrix of the previous frame image and B is that of the current frame image.
  • As in the grayscale case, a template gradient matrix is taken from the current frame's gradient matrix B, and the search sub-matrix covered by the template within the previous frame's gradient matrix A is S(i, j), where i, j is the position of the sub-matrix's top-left element, with i and j ranging over [-4, 4] and S(0, 0) corresponding to position [4, 4] of A.
  • A difference operation on the template gradient matrix and the search sub-matrix S(i, j) yields a difference matrix C. The absolute values of the elements of C whose corresponding gradient values exceed the gradient threshold T in both frames are accumulated to obtain the sum SS(i, j); that is, condition 1 is A[r, c] > T and condition 2 is B[r, c] > T, where A[r, c] and B[r, c] are the gradient values at row r, column c of the gradient value matrices A and B, respectively.
  • The (i, j) corresponding to the smallest SS(i, j) is the best matching position, that is, the relative offset or movement of the two adjacent frames in the X and Y directions, which is the image X, Y offset of the two adjacent frames, with the offset in the range [-4, 4].
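  • A sketch of the gradient-template variant in NumPy; the gradient threshold value is a placeholder, and the masking reflects conditions 1 and 2 as reconstructed above:

    import cv2
    import numpy as np

    def sobel_gradient_offset(prev_gray, curr_gray, R=4, T=50.0):
        """Match Sobel gradient matrices of two adjacent frames; only pixels
        whose gradient exceeds T in both frames contribute to SS(i, j)."""
        def grad(img):
            fx = cv2.Sobel(img, cv2.CV_32F, 1, 0)
            fy = cv2.Sobel(img, cv2.CV_32F, 0, 1)
            return np.sqrt(fx * fx + fy * fy)
        A, B = grad(prev_gray), grad(curr_gray)     # previous / current frame
        H, W = B.shape
        Tm = B[R:H - R, R:W - R]                    # template from current frame
        Bm = Tm > T
        best, best_ij = None, (0, 0)
        for i in range(-R, R + 1):
            for j in range(-R, R + 1):
                S = A[R + i:H - R + i, R + j:W - R + j]
                mask = Bm & (S > T)                 # conditions 1 and 2
                if not mask.any():
                    continue
                ss = np.abs(Tm - S)[mask].sum()
                if best is None or ss < best:
                    best, best_ij = ss, (j, i)
        return best_ij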
  • When the scene in which the flying device 100 is currently located has particularly bright or particularly dark illumination, the image offset determination module 14 first applies histogram equalization to even out the image brightness, then determines whether the scene is texture-rich and, following the correspondence described above, processes the image with the algorithm for texture-rich or texture-poor scenes.
  • When the scene in which the flying device 100 is currently located contains reflections, the image offset determination module 14 removes the shadows and reflections from the image, then determines whether the scene is texture-rich and, following the correspondence described above, selects the corresponding algorithm according to whether the texture is rich.
  • When the scene in which the flying device 100 is currently located is a line-rich scene, the image offset determination module 14 divides the detected straight lines into horizontal lines and vertical lines. For a line detected in the vertical direction of the previous frame, it finds the line Lcv detected in the vertical direction of the current frame with the smallest angular difference, and computes the distance between the two lines to obtain i (set to 0 when i is not in [-R, R]); the horizontal lines are handled in the same way to obtain j.
  • The i, j obtained by this line positioning are the relative offset or movement of the two adjacent frames in the X and Y directions, that is, the image X, Y offsets of the two adjacent frames, with the offset in the range [-4, 4].
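  • A rough sketch of this line-based positioning; the angle bucketing and the pairing rule are a reconstruction of the description, and the Hough parameters are assumptions:

    import cv2
    import numpy as np

    def line_offset(prev_gray, curr_gray, R=4):
        """Pair the closest-angle vertical (and horizontal) lines of two
        adjacent frames; their separation gives the X (and Y) offset."""
        def lines(img):
            edges = cv2.Canny(img, 50, 150)
            segs = cv2.HoughLinesP(edges, 1, np.pi / 180, 80,
                                   minLineLength=img.shape[1] // 2, maxLineGap=5)
            v, h = [], []
            for x1, y1, x2, y2 in (segs[:, 0] if segs is not None else []):
                ang = np.degrees(np.arctan2(y2 - y1, x2 - x1))
                (v if abs(abs(ang) - 90) < 20 else h).append((x1, y1, x2, y2, ang))
            return v, h
        def pair_offset(prev, curr, axis):
            if not prev or not curr:
                return 0
            lp = prev[0]
            lc = min(curr, key=lambda l: abs(l[4] - lp[4]))  # closest angle
            # Midpoint separation along the relevant axis (x for vertical lines).
            d = (lc[axis] + lc[axis + 2]) / 2 - (lp[axis] + lp[axis + 2]) / 2
            return int(d) if -R <= d <= R else 0
        pv, ph = lines(prev_gray)
        cv, ch = lines(curr_gray)
        return pair_offset(pv, cv, 0), pair_offset(ph, ch, 1)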
  • Therefore, in the present invention, the speed of the flying device 100 can be determined and positioning control performed even when there is no GPS signal, and more precise control can further be performed based on the different scenes.
  • FIG. 5 is a flowchart of a flight control method according to an embodiment of the present invention.
  • the flight control method is used to detect the speed of the flying device 100 and perform positioning control.
  • the acquisition module 11 acquires an image acquired by the camera module 20 (501).
  • the scene determination module 12 determines the scene in which the flight device 100 is currently located (503). The scene determining module 12 determines the type of the scene according to at least one parameter feature in the image collected by the camera module 20.
  • the height determination module 13 is configured to determine the height of the flying device 100 based on the depth of field information of the image acquired by the binocular camera module 20 (505).
  • The image offset determination module 14 calculates, from two adjacent frames captured by the camera module 20 and the scene in which the flying device 100 is currently located, the image X, Y offsets of the second frame relative to the first (505). The image offset determination module 14 determines the same feature points in the two adjacent frames and calculates their X and Y offsets between the two frames to obtain the image X, Y offsets.
  • In one embodiment, the scene determination module 12 determines the type of scene according to at least one parameter of the images captured by the camera module 20, selects a corresponding algorithm according to the scene in which the flying device 100 is currently located, and analyzes the two adjacent frames with that algorithm to obtain the image offset of the second frame relative to the first.
  • The offset calibration module 15 acquires the acceleration and angular velocity of the flying device 100 in three dimensions as detected by the acceleration sensor 30, and compensates the image X, Y offsets according to that acceleration and angular velocity to obtain the image correction offset (507).
  • The speed calculation module 16 calculates, from the lens focal length, the height of the flying device 100, and the image correction offset, the X, Y offsets in world coordinates corresponding to the image correction offset, and determines the speed of the flying device from the capture interval of the two adjacent frames and the world-coordinate X, Y offsets (509). Specifically, the speed calculation module 16 computes the speed of the flying device in the X direction and in the Y direction from the time interval t1 and the world-coordinate X, Y offsets; more specifically, it divides the X and Y offsets by the time interval to obtain the speeds of the flying device 100 in the X and Y directions, respectively.
  • The operation control module 17 positions and/or hovers the flying device 100 based at least on the speed of the flying device 100 (511).
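  • Putting steps 501 to 511 together, a high-level sketch of one control iteration; every helper is either one of the sketches above or an assumed stand-in (stereo_stats, compensate), not the patented implementation itself:

    def control_step(left, right, prev_gray, curr_gray, imu, f, t1):
        """One iteration: scene -> height -> image offset -> IMU compensation
        -> world offset -> speed (steps 501-511)."""
        texture_rich = is_texture_rich(curr_gray)                    # 503
        H = select_height(texture_rich, *stereo_stats(left, right))  # 505, assumed helper
        dx, dy = (sad_offset(prev_gray, curr_gray) if texture_rich
                  else sobel_gradient_offset(prev_gray, curr_gray))
        x1, y1 = compensate(dx, dy, imu)                             # 507, assumed helper
        vx, vy, v = speed(x1, y1, f, H, t1)                          # 509
        return vx, vy, v                                             # input to 511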

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A flight control method, comprising: acquiring images captured by a camera module (20) of a flying device (100); determining the scene in which the flying device (100) is currently located; determining the height of the flying device (100) according to the depth-of-field information of the captured images; calculating the image X, Y offsets of two adjacent frames according to the two captured adjacent frames and the scene in which the flying device (100) is located; compensating the image X, Y offsets to obtain an image correction offset; and calculating the X, Y offsets in world coordinates from the lens focal length of the camera module (20), the height, and the image correction offset, and determining the speed of the flying device from the capture interval of the two adjacent frames and the world-coordinate X, Y offsets. Also provided are a flight control system (S1) and a flying device (100), enabling speed detection and positioning control of the flying device (100) when there is no GPS signal.

Description

Flight device, flight control system and method

[Technical Field]
The present invention relates to a method, and more particularly to a flight control method, a flight control system, and a flying device for controlling a flying device.
[Background]
At present, flying devices such as drones, owing to their convenience and safety, are widely used in agricultural production, geological survey, weather monitoring, power line inspection, rescue and disaster relief assistance, video shooting, map building, and other fields. In drone control, speed detection and/or positioning control of the drone is a key technology. At present, speed detection and/or positioning control of drones mostly relies on GPS (global positioning system) positioning; however, when the drone is in an area where the GPS signal is weak or not covered, speed detection and/or positioning control of the drone becomes impossible. In addition, current speed detection and/or positioning control of drones mostly uses an algorithm based on a single generic scene; when the scene the drone is actually in differs greatly from that generic scene, the generic-scene algorithm often fails to position accurately.
[Summary of the Invention]
In view of this, the present invention provides a flying device and a flight control method that can perform speed detection and positioning control of a flying device without relying on GPS.
To solve the above technical problem, the present invention provides the following technical solutions.
In one aspect, the present invention provides a flight control system for controlling a flying device, the flight control system comprising: an acquisition module for acquiring images captured by a binocular camera module of the flying device; a scene determination module for determining the scene in which the flying device is currently located; a height determination module for determining the height of the flying device according to the depth-of-field information of the images captured by the binocular camera module; an image offset determination module for calculating, according to two adjacent frames captured by the binocular camera module and the scene in which the flying device is currently located, the image X offset and image Y offset of the second frame relative to the first; an offset calibration module for acquiring the acceleration and angular velocity of the flying device in three dimensions as detected by the acceleration sensor of the flying device, and compensating the image X, Y offsets according to that acceleration and angular velocity to obtain an image correction offset comprising the corrected image X offset and image Y offset; and a speed calculation module for calculating, from the lens focal length of the binocular camera module, the height of the flying device, and the image correction offset, the X, Y offsets in world coordinates corresponding to the image correction offset, and determining the speed of the flying device from the capture interval of the two adjacent frames and the world-coordinate X, Y offsets.
In some embodiments, determining the height of the flying device from the depth information of the images captured by the binocular camera module includes: the height determination module calibrates the two cameras of the binocular camera module to obtain the intrinsic parameters of the camera module, including focal length, image center and distortion coefficients, and the extrinsic parameters, including the rotation matrix and translation matrix; the height determination module rectifies the binocular pair, then performs stereo matching to obtain a disparity map, and performs three-dimensional reconstruction to obtain depth information; the depth map is converted to grayscale and normalized to the range [0, 255] to obtain the height.
In some embodiments, the flying device further includes a distance sensor for detecting the distance between the flying device and the ground, and the height determination module is configured to decide, according to the scene in which the flying device is currently located, whether to derive the height of the flying device from the depth-of-field information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device.
In some embodiments, the scene determination module determines whether the scene in which the flying device is located is a texture-rich scene or a texture-poor scene according to at least one parameter of the images captured by the camera module, the parameters including but not limited to texture.
In some embodiments, the height determination module chooses, according to whether the scene is texture-rich and according to the maximum area of regions with the same texture, whether to determine the height of the flying device from the depth-of-field information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device.
In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is greater than a minimum value SMIN and less than a maximum value SMAX, the height determination module determines the height of the flying device from the depth-of-field information of the images captured by the cameras of the camera module, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is set to 3/4 of the maximum image area.
In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is greater than the maximum value SMAX, and the difference between the height derived for that region from the depth-of-field information of the captured images and the height measured by the distance sensor exceeds a threshold, the depth-of-field information calculated from the images captured by the camera module is used to determine the height of the flying device, where the maximum value SMAX is 3/4 of the maximum image area.
In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is smaller than the minimum value SMIN or greater than the maximum value SMAX, the height determination module either determines the height of the flying device from the depth-of-field information of the captured images or uses the distance measured by the distance sensor as the height of the flying device, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is 3/4 of the maximum image area.
In some embodiments, in a scene without rich texture, the height determination module uses the distance measured by the distance sensor as the height of the flying device.
In some embodiments, the speed calculation module calculates the world-coordinate X offset according to formula 1: x1/X1 = f/H, and the world-coordinate Y offset according to formula 2: y1/Y1 = f/H, where x1 is the corrected image X offset, y1 is the corrected image Y offset, f is the lens focal length, H is the height of the flying device, X1 is the world-coordinate X offset, and Y1 is the world-coordinate Y offset; the speed calculation module then calculates the speed of the flying device in the X direction and in the Y direction from the interval at which the camera module captured the two adjacent frames and the world-coordinate X, Y offsets.
In another aspect, the present invention provides a flight control method for controlling a flying device, the flight control method comprising: acquiring images captured by a binocular camera module of the flying device; determining the scene in which the flying device is currently located; determining the height of the flying device according to the depth-of-field information of the images captured by the binocular camera module; calculating, according to two adjacent frames captured by the binocular camera module and the scene in which the flying device is currently located, the image X offset and image Y offset of the second frame relative to the first; acquiring the acceleration and angular velocity of the flying device in three dimensions as detected by the acceleration sensor of the flying device, and compensating the image X offset and image Y offset according to that acceleration and angular velocity to obtain an image correction offset comprising the corrected image X offset and image Y offset; and calculating, from the lens focal length of the camera module, the height of the flying device, and the image correction offset, the X, Y offsets in world coordinates corresponding to the image correction offset, and determining the speed of the flying device from the capture interval of the two adjacent frames and the world-coordinate X, Y offsets.
In some embodiments, the step of "determining the height of the flying device according to the depth-of-field information of the images captured by the binocular camera module" includes: calibrating the two cameras of the binocular camera module to obtain the intrinsic parameters of the camera module, including focal length, image center and distortion coefficients, and the extrinsic parameters, including the rotation matrix and translation matrix; rectifying the binocular pair; performing stereo matching to obtain a disparity map; and performing three-dimensional reconstruction to obtain depth information, converting the depth map to grayscale and normalizing it to the range [0, 255] to obtain the height.
In some embodiments, the method further comprises: deciding, according to the scene in which the flying device is currently located, whether to derive the height of the flying device from the depth-of-field information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device.
In some embodiments, the step of "determining the scene in which the flying device is currently located" includes: determining whether the scene in which the flying device is located is a texture-rich scene or a texture-poor scene according to at least one parameter of the images captured by the camera module, the parameters including but not limited to texture.
In some embodiments, the step of "deciding, according to the scene in which the flying device is currently located, whether to derive the height of the flying device from the depth-of-field information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device" includes: in a texture-rich scene, if the maximum region area AM of the image is greater than the minimum value SMIN and less than the maximum value SMAX, determining the height of the flying device from the depth-of-field information of the images captured by the cameras of the camera module, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is set to 3/4 of the maximum image area.
In some embodiments, the same step includes: in a texture-rich scene, if the maximum area AM of regions with similar depth values in the depth image is judged greater than the maximum value SMAX, and the difference between the height derived for that region from the depth-of-field information of the captured images and the height measured by the distance sensor exceeds a threshold, determining the height of the flying device from the depth-of-field information calculated from the captured images, where the maximum value SMAX is 3/4 of the maximum image area.
In some embodiments, the same step includes: in a texture-rich scene, if the maximum area AM of regions with similar depth values in the depth image is judged smaller than the minimum value SMIN or greater than the maximum value SMAX, either determining the height of the flying device from the depth-of-field information of the captured images or using the distance measured by the distance sensor as the height of the flying device, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is 3/4 of the maximum image area.
In some embodiments, the same step includes: in a scene without rich texture, using the distance measured by the distance sensor as the height of the flying device.
In some embodiments, the step of "calculating, from the lens focal length of the binocular camera module, the height of the flying device, and the image correction offset, the X, Y offsets in world coordinates corresponding to the image correction offset" includes: calculating the world-coordinate X offset according to the formula x1/X1 = f/H; and calculating the world-coordinate Y offset according to the formula y1/Y1 = f/H, where x1 is the image X offset, y1 is the image Y offset, f is the lens focal length, H is the height of the flying device, X1 is the world-coordinate X offset, and Y1 is the world-coordinate Y offset.
In another aspect, a flying device is provided, comprising a binocular camera module, a distance sensor, an acceleration sensor, and a flight control system. The binocular camera module includes two cameras for capturing images respectively; the distance sensor is used to obtain the height of the flying device; the acceleration sensor is used to detect the acceleration and angular velocity of the flying device in three dimensions. The flight control system comprises: an acquisition module for acquiring images captured by the binocular camera module of the flying device; a scene determination module for determining the scene in which the flying device is currently located; a height determination module for determining the height of the flying device according to the depth-of-field information of the images captured by the binocular camera module; an image offset determination module for calculating, according to two adjacent frames captured by the binocular camera module and the scene in which the flying device is currently located, the image X offset and image Y offset of the second frame relative to the first; an offset calibration module for acquiring the acceleration and angular velocity of the flying device in three dimensions as detected by the acceleration sensor of the flying device, and compensating the image X, Y offsets according to that acceleration and angular velocity to obtain an image correction offset comprising the corrected image X offset and image Y offset; and a speed calculation module for calculating, from the lens focal length of the binocular camera module, the height of the flying device, and the image correction offset, the X, Y offsets in world coordinates corresponding to the image correction offset, and determining the speed of the flying device from the capture interval of the two adjacent frames and the world-coordinate X, Y offsets.
In some embodiments, determining the height of the flying device from the depth information of the images captured by the binocular camera module includes: the height determination module calibrates the two cameras of the binocular camera module to obtain the intrinsic parameters of the camera module, including focal length, image center and distortion coefficients, and the extrinsic parameters, including the rotation matrix and translation matrix; the height determination module rectifies the binocular pair, then performs stereo matching to obtain a disparity map, and performs three-dimensional reconstruction to obtain depth information; the depth map is converted to grayscale and normalized to the range [0, 255] to obtain the height.
In some embodiments, the flying device further includes a distance sensor for detecting the distance between the flying device and the ground, and the height determination module is configured to decide, according to the scene in which the flying device is currently located, whether to derive the height of the flying device from the depth-of-field information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device.
In some embodiments, the scene determination module determines whether the scene in which the flying device is located is a texture-rich scene or a texture-poor scene according to at least one parameter of the images captured by the camera module, the parameters including but not limited to texture.
In some embodiments, the height determination module chooses, according to whether the scene is texture-rich and according to the maximum area of regions with the same texture, whether to determine the height of the flying device from the depth-of-field information of the images captured by the camera module or to use the distance detected by the distance sensor as the height of the flying device.
In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is greater than the minimum value SMIN and less than the maximum value SMAX, the height determination module determines the height of the flying device from the depth-of-field information of the images captured by the cameras of the camera module, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is set to 3/4 of the maximum image area.
In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is greater than the maximum value SMAX, and the difference between the height derived for that region from the depth-of-field information of the captured images and the height measured by the distance sensor exceeds a threshold, the depth-of-field information calculated from the images captured by the camera module is used to determine the height of the flying device, where the maximum value SMAX is 3/4 of the maximum image area.
In some embodiments, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions with similar depth values in the depth image is smaller than the minimum value SMIN or greater than the maximum value SMAX, the height determination module either determines the height of the flying device from the depth-of-field information of the captured images or uses the distance measured by the distance sensor as the height of the flying device, where the minimum value SMIN is 1/4 of the maximum image area and the maximum value SMAX is 3/4 of the maximum image area.
In some embodiments, in a scene without rich texture, the height determination module uses the distance measured by the distance sensor as the height of the flying device.
In some embodiments, the speed calculation module calculates the world-coordinate X offset according to formula 1: x1/X1 = f/H, and the world-coordinate Y offset according to formula 2: y1/Y1 = f/H, where x1 is the corrected image X offset, y1 is the corrected image Y offset, f is the lens focal length, H is the height of the flying device, X1 is the world-coordinate X offset, and Y1 is the world-coordinate Y offset; the speed calculation module then calculates the speed of the flying device in the X direction and in the Y direction from the interval at which the camera module captured the two adjacent frames and the world-coordinate X, Y offsets.
The beneficial effect of the present invention is that speed detection and positioning control can still be performed when the GPS signal is weak or absent, and precise control can be performed based on different scenes.
[Brief Description of the Drawings]
FIG. 1 is a schematic diagram of the hardware architecture of a flying device in an embodiment of the present invention.
FIG. 2 is a block diagram of a flight control system in an embodiment of the present invention.
FIG. 3 is an explanatory diagram of the X, Y offsets in world coordinates in an embodiment of the present invention.
FIG. 4 is a schematic diagram of the relationship between the world-coordinate X, Y offsets and the image correction offset in an embodiment of the present invention.
FIG. 5 is a flowchart of a flight control method in an embodiment of the present invention.
Reference numerals:
Flying device                                    100
Processor                                        10
Camera module                                    20
Acceleration sensor                              30
Distance sensor                                  40
Flight control system                            S1
Acquisition module                               11
Scene determination module                       12
Height determination module                      13
Image offset determination module                14
Offset calibration module                        15
Speed calculation module                         16
Operation control module                         17
Cameras                                          21, 22
Lens                                             201
Image sensor                                     202
Images                                           P1, P2
Object                                           A
Imaging point                                    A1
Lens focal length                                f
Height                                           H
Corrected image X offset                         x1
Corrected image Y offset                         y1
World-coordinate X offset                        X1
World-coordinate Y offset                        Y1
Steps                                            501-511
[Detailed Description]
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here merely explain the present invention and do not limit it.
Please refer to FIG. 1, a schematic diagram of the hardware architecture of a flying device 100 in one embodiment. The flying device 100 includes a processor 10, a camera module 20, and an acceleration sensor 30.
The camera module 20 captures images at predetermined intervals, for example one image every second. In this embodiment, the camera module 20 is a binocular camera module including two cameras 21, 22, through each of which the camera module 20 captures an image. The acceleration sensor 30 detects the acceleration and angular velocity of the flying device 100 in three dimensions (X, Y, Z). In one embodiment, the acceleration sensor 30 may be a gyroscope, and the distance sensor 40 may be an ultrasonic sensor.
Referring also to FIG. 2, the processor 10 runs a flight control system S1. As shown in FIG. 2, the flight control system S1 includes an acquisition module 11, a scene determination module 12, a height determination module 13, an image offset determination module 14, an offset calibration module 15, a speed calculation module 16, and an operation control module 17. The flight control system S1 detects the speed of the flying device 100 and performs positioning control. The modules of the flight control system S1 may be programmed instruction modules invoked and executed by the processor 10, or may be firmware embedded in the processor 10. In one application, the flight control system S1 may be application software installed in the flying device 100.
The acquisition module 11 acquires the images captured by the camera module 20. In this embodiment, the acquisition module 11 acquires the captured images in real time.
The scene determination module 12 determines the scene in which the flying device 100 is currently located. Specifically, the scene determination module 12 determines the scene in which the flying device 100 is located from at least one parameter feature in the image.
The height determination module 13 calculates depth-of-field information from the images captured by the cameras 21, 22 of the camera module 20, as obtained by the acquisition module 11, and determines the height of the flying device 100 from that depth-of-field information.
The image offset determination module 14 calculates, from two adjacent frames captured by the camera module 20 as obtained by the acquisition module 11 and the scene determined by the scene determination module 12, the image X offset and image Y offset of the second frame relative to the first. The two adjacent frames are frames captured consecutively by either camera of the camera module 20. In one embodiment, the image offset determination module 14 analyzes the change of parameters between the two frames with the algorithm corresponding to the scene in which the flying device 100 is located, and thereby computes the image X, Y offsets of the second frame relative to the first. In another embodiment, the image offset determination module 14 may determine the same feature point in the two adjacent frames and compute its X, Y offsets between them to obtain the image X, Y offsets; the same feature point is the imaging point of the same object in the two adjacent frames, and the image X, Y offsets are the offsets of that imaging point between the two frames in the X and Y directions.
The offset calibration module 15 acquires the acceleration and angular velocity of the flying device 100 in three dimensions as detected by the acceleration sensor 30, and compensates the image X, Y offsets according to that acceleration and angular velocity to obtain an image correction offset.
The speed calculation module 16 calculates, from the lens focal length, the height of the flying device 100, and the image correction offset, the X, Y offsets in world coordinates corresponding to the image correction offset, i.e. the actual X, Y offsets in the real world. In the present invention, X and Y refer respectively to the horizontal-axis and vertical-axis directions of the plane parallel to the ground in the three-dimensional coordinate system. The world-coordinate X, Y offsets are the distances moved by the flying device 100 / camera module 20 relative to the ground in the X and Y directions.
请一并参阅图3,为世界坐标的X、Y偏移量的说明示意图。其中,由于飞行装置100在拍摄两帧相邻图像P1、P2的时间间隔内会移动,因此,对于一实际的物体A而言,飞行装置100的相机模组20在采集图像时,相对该实际物体A会发生相对运动。如图3所示,相机模组20的摄像头21、22中的每一个均包括镜头201及图像传感器202。当该飞行装置100在拍摄两帧相邻图像P1、P2的时间间隔内向右上方移动一定距离时,导致该实际物体A通过飞行装置100的镜头201在图像传感器202上成像后,该实际物体A在该两帧相邻图像P1、P2中的成像点A1会发生如图3中所示的朝左下方的偏移,而形 成图像X偏移量和图像Y偏移量,而该图像X偏移量和图像Y偏移量经过补偿校正后即为图像校正偏移量。因此,该图像校正偏移量与世界坐标的X、Y偏移量呈一定的对应关系,可根据该图像校正偏移量等得到世界坐标的X、Y偏移量。其中,相机模组20可为照相机、摄像机、摄像头等。该图像校正偏移量为物体A在该两帧相邻图像P1、P2中的成像点A1对应在图像传感器202上的X方向和Y方向的实际距离。
Referring also to FIG. 4, a schematic diagram of the relationship between the world-coordinate X and Y offsets and the corrected image offset is shown. Let the lens focal length be f, the height of the flight device 100 be H, the corrected image X offset be x1, the corrected image Y offset be y1, the world-coordinate X offset be X1, and the world-coordinate Y offset be Y1. When the camera module 20 shoots downward, the object distance is the height H. As FIG. 4 shows, the ratio of the corrected image X (or Y) offset to the world-coordinate X (or Y) offset equals the ratio of the lens focal length f to the height H. That is, the relationship between the world-coordinate offsets and the corrected image offset satisfies Formula 1: x1/X1 = f/H and Formula 2: y1/Y1 = f/H. Since the focal length f, the height H, and the corrected offsets x1 and y1 are all known, the velocity calculation module 16 can solve Formula 1 and Formula 2 for the world-coordinate offsets X1 and Y1.
The velocity calculation module 16 then derives the speed of the flight device from the time interval t1 between the two adjacent frames and the world-coordinate X and Y offsets. As stated above, the world-coordinate X and Y offsets are the distances the flight device moves in the X and Y directions during the interval between the two frames; with the capture interval denoted t1, the velocity calculation module 16 obtains the speed in the X direction as X1/t1 and the speed in the Y direction as Y1/t1. In one embodiment, the velocity calculation module 16 first takes the vector sum of the world-coordinate X and Y offsets to obtain the actual displacement D1 of the flight device 100, and then computes the actual speed of the flight device 100 as D1/t1.
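The arithmetic above is compact enough to state directly. The following minimal Python sketch evaluates Formula 1 and Formula 2 and the resulting speeds; every numeric value below is a hypothetical placeholder, not a value from the present invention.

```python
# Minimal numeric sketch of Formula 1 (x1/X1 = f/H), Formula 2 (y1/Y1 = f/H),
# and the speed estimate; all values here are assumed placeholders.
f = 0.004              # lens focal length in metres (assumed)
H = 2.0                # height of the flight device in metres (assumed)
x1, y1 = 2e-5, -1e-5   # corrected image X/Y offsets on the sensor, metres (assumed)
t1 = 0.1               # capture interval between the two adjacent frames, seconds (assumed)

X1 = x1 * H / f        # world-coordinate X offset, solved from Formula 1
Y1 = y1 * H / f        # world-coordinate Y offset, solved from Formula 2

vx, vy = X1 / t1, Y1 / t1        # per-axis speeds
D1 = (X1 ** 2 + Y1 ** 2) ** 0.5  # actual displacement (vector sum of X1, Y1)
speed = D1 / t1                  # actual speed of the flight device
print(vx, vy, speed)
```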
The operation control module 17 performs positioning and/or hover control of the flight device 100 based at least on its speed. For example, the operation control module 17 computes the time needed to reach the destination from the speed of the flight device 100 and the distance between it and the destination, and prepares for hovering or landing when that time falls below a predetermined value. In one embodiment, when the operation control module 17 determines that the currently computed speed is essentially equal in magnitude but opposite in direction to the speed computed at the previous instant, it concludes that the flight speed of the flight device 100 is close to zero and the distance moved is very small, e.g., 1 cm, and controls the flight device 100 to hover at that position.
Thus, according to the present invention, the speed of the flight device 100 can be computed from the captured images even without a GPS signal, and positioning control can still be performed.
The flight device 100 is an unmanned aerial vehicle.
In this embodiment, the scene determination module 12 automatically determines the current scene of the flight device 100 from the images captured by the camera module 20. In other embodiments, the scene determination module 12 may instead take a scene selected by the user as the current scene of the flight device 100.
As shown in FIG. 1, in one embodiment the flight device 100 further includes a distance sensor 40 that detects the distance between the flight device 100 and the ground, i.e., the height of the flight device 100. Depending on the current scene of the flight device 100, the height determination module 13 chooses either the height derived from the depth-of-field information computed from the images captured by the camera module 20 or the distance detected by the distance sensor 40 as the height of the flight device 100.
The scene determination module 12 determines the type of the scene from at least one parameter feature of the images captured by the camera module 20.
The at least one parameter includes texture. The scene determination module 12 processes the image with the Sobel gradient algorithm to obtain a gradient matrix, counts the number C of pixels whose gradient exceeds a threshold T1, and deems the texture rich when C exceeds a threshold T2, and sparse otherwise.
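As an illustration of this texture check, the following Python sketch (using OpenCV) counts high-gradient pixels; the threshold values t1 and t2 are assumptions, not values given in the specification.

```python
import cv2
import numpy as np

def is_texture_rich(gray, t1=50.0, t2=5000):
    """Sketch of the texture test: Sobel gradient magnitude per pixel,
    count C of pixels with gradient > T1, texture-rich when C > T2.
    The values of t1 and t2 here are assumed, not from the patent."""
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)   # horizontal difference
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)   # vertical difference
    mag = cv2.magnitude(gx, gy)              # gradient matrix
    c = int(np.count_nonzero(mag > t1))      # pixels above threshold T1
    return c > t2
```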
The at least one parameter further includes reflections. The scene determination module 12 performs shadow statistics over several consecutive frames to judge whether reflections are present, thereby classifying the scene as reflective or non-reflective. Specifically, the scene is deemed reflective when the scene determination module 12 finds alternating dark and bright regions across the consecutive frames, or when a lamp-reflection check, based on the shape of the UAV's lamp, finds a shape matching that lamp. Specifically, the scene determination module applies a reflection detection algorithm. On ground that reflects easily, the circular lamp of the UAV appears in the image as an over-bright gray region, so the scene determination module 12 tests each pixel of the grayscale image against a threshold T, set empirically, e.g., to 220; pixels with gray values greater than or equal to T are set to 255 and the rest to 0, converting the image into a binary image with 0 as background and 255 as foreground. Connected regions are then extracted and represented by their bounding rectangles, and the scene determination module 12 checks the target size: a region within the expected size range is taken to be a lamp-reflection target. The size range is derived from measurements of the lamp reflection at different heights.
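A possible rendering of this lamp-reflection check in Python/OpenCV follows; the size bounds stand in for the height-dependent measurement table mentioned above and are assumptions.

```python
import cv2

def detect_lamp_reflection(gray, t=220, min_size=5, max_size=80):
    """Sketch of the reflection test: binarise at T (~220, empirical),
    extract connected regions as bounding rectangles, and accept a
    region whose bounding box falls in the expected lamp size range.
    min_size/max_size are assumed stand-ins for the measured table."""
    _, binary = cv2.threshold(gray, t, 255, cv2.THRESH_BINARY)
    n, _, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    for i in range(1, n):                    # label 0 is the background
        w = stats[i, cv2.CC_STAT_WIDTH]
        h = stats[i, cv2.CC_STAT_HEIGHT]
        if min_size <= w <= max_size and min_size <= h <= max_size:
            return True                      # plausible lamp reflection
    return False
```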
The at least one parameter further includes gray level. The scene determination module 12 converts the image into a gray-level histogram, computes statistics on it, and compares them with corresponding thresholds to classify the scene as dark, of normal brightness, or bright. Specifically, with L denoting the average brightness of the gray-level histogram (an 8-bit gray level), the scene determination module 12 classifies the scene as dark when L < 80, bright when L > 170, and of normal brightness when 80 ≤ L ≤ 170.
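This classification is a direct comparison against the two thresholds, as the short sketch below shows.

```python
def classify_brightness(gray):
    """Classify the scene from the mean gray level L of the image:
    dark when L < 80, bright when L > 170, normal brightness otherwise."""
    l = float(gray.mean())
    if l < 80:
        return "dark"
    if l > 170:
        return "bright"
    return "normal"
```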
The at least one parameter may further include lines. The scene determination module 12 performs gradient detection on the image, binarizes it, and then applies classic Hough line detection; if at least one straight line is found whose length spans at least half the image width, the scene is judged line-rich.
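One way to realize this line check with OpenCV is sketched below; Canny is assumed as the gradient/binarization step, and the probabilistic Hough transform is used in place of the classic one so that the half-width length criterion can be enforced directly.

```python
import cv2
import numpy as np

def is_line_rich(gray):
    """Sketch of the line test: edge detection and binarisation, then
    Hough line detection; the scene is line-rich when at least one line
    spans at least half of the image width. Canny thresholds assumed."""
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=gray.shape[1] // 2, maxLineGap=5)
    return lines is not None and len(lines) > 0
```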
In one embodiment, the scene determination module 12 determines the current scene type from a single one of the above parameters and its corresponding algorithm. In other embodiments, the scene determination module may determine the scene from several of the above parameters and their corresponding algorithms together; for example, it may use texture and lines jointly to judge whether the current scene is both texture-rich and line-bearing.
The height determination module 13 selects either the depth-of-field information computed from the images captured by the cameras 21, 22 of the camera module 20 or the distance detected by the distance sensor 40 as the height of the flight device 100, according to whether the scene is texture-rich and to the maximum area of regions of identical texture. Specifically, this selection proceeds through the following cases (a consolidated code sketch follows them below):
In a texture-rich scene, if the maximum area AM of regions of similar depth values in the depth image is greater than SMIN and less than SMAX, the height determination module 13 uses the depth-of-field information computed from the images captured by the cameras 21, 22 as the height of the flight device 100. SMIN may be set to 1/4 of the maximum image area (image width × image height), and SMAX to 3/4 of the maximum image area.
In a texture-rich scene, if the maximum region area AM is greater than SMAX and the height H for that region derived from the depth-of-field information differs from the height measured by the distance sensor 40 by more than a threshold (e.g., 10 cm), the ultrasonic measurement is deemed inaccurate for that scene, and the height determination module 13 uses the depth-of-field information computed from the images captured by the cameras 21, 22 as the height of the flight device 100.
For texture-free images, such as floor tiles with very little texture, stereo matching yields no precise result, so the derived depth information and height measurement are inaccurate, whereas ultrasound gives a fairly precise height measurement in such scenes. Therefore, in scenes that are not texture-rich, the height determination module 13 uses the height measured by the distance sensor 40.
Otherwise, in a texture-rich scene, if the maximum region area AM is less than SMIN or greater than SMAX (with SMIN set to 1/4 of the maximum image area (image width × image height) and SMAX to 3/4 of the maximum image area, as above), the height determination module 13 likewise falls back to the height measured by the distance sensor 40.
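The consolidated sketch announced above renders these cases as one selection routine in Python; the 10 cm gap and the SMIN/SMAX fractions follow the text, while the ordering of the fallback cases is an interpretation of the description.

```python
def choose_height(texture_rich, am, image_area, depth_height, ultrasonic_height,
                  diff_threshold_m=0.10):
    """Sketch of the height-source selection by height determination
    module 13. am is the maximum area of similar depth values in the
    depth image; SMIN/SMAX are 1/4 and 3/4 of the image area."""
    smin, smax = image_area / 4, 3 * image_area / 4
    if not texture_rich:
        return ultrasonic_height          # sparse texture: trust ultrasound
    if smin < am < smax:
        return depth_height               # stereo depth reliable here
    if am > smax and abs(depth_height - ultrasonic_height) > diff_threshold_m:
        return depth_height               # ultrasound deemed inaccurate
    return ultrasonic_height              # AM outside [SMIN, SMAX]: fall back
```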
The height determination module 13 computes the depth-of-field information from the images captured by the cameras 21, 22 of the camera module 20 as follows. The two cameras 21, 22 are first calibrated to obtain the intrinsic parameters of the cameras (focal length, image center, distortion coefficients, etc.) and the extrinsic parameters (rotation matrix R and translation matrix T); concretely: 1) calibrate the left camera 21 to obtain its intrinsic and extrinsic parameters; 2) calibrate the right camera 22 to obtain its extrinsic parameters; 3) perform stereo calibration to obtain the translation and rotation between the cameras. The stereo pair is then rectified, e.g., distortion-corrected; stereo matching is performed to obtain a disparity map; and finally 3D (three-dimensional) reconstruction yields the depth information, and the depth map is converted to grayscale and normalized to the range [0, 255] to obtain the height H.
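A compact OpenCV rendering of this pipeline is sketched below; the calibration results K1, D1, K2, D2, R, T are assumed to come from a prior stereo calibration of the two cameras, and SGBM is assumed as the stereo matcher since the text does not name one.

```python
import cv2
import numpy as np

def depth_map_from_stereo(left, right, K1, D1, K2, D2, R, T):
    """Sketch: rectify the calibrated stereo pair, run stereo matching to
    get a disparity map, reproject to 3D for depth, normalise to [0, 255].
    left/right are 8-bit grayscale frames from cameras 21 and 22."""
    size = (left.shape[1], left.shape[0])
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)
    m1x, m1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
    m2x, m2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)
    left_r = cv2.remap(left, m1x, m1y, cv2.INTER_LINEAR)     # undistort + rectify
    right_r = cv2.remap(right, m2x, m2y, cv2.INTER_LINEAR)
    matcher = cv2.StereoSGBM_create(numDisparities=64, blockSize=9)
    disparity = matcher.compute(left_r, right_r).astype(np.float32) / 16.0
    z = cv2.reprojectImageTo3D(disparity, Q)[:, :, 2]        # depth (Z) channel
    z[disparity <= 0] = 0                                    # drop invalid matches
    return cv2.normalize(z, None, 0, 255, cv2.NORM_MINMAX)   # grayscale depth map
```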
The image offset determination module 14 computes the horizontal image offsets, i.e., the image X and Y offsets, of the second of two adjacent frames relative to the first from the two frames and the current scene of the flight device 100 as follows: it analyzes the change of a parameter between the two frames using the algorithm corresponding to the scene in which the flight device 100 is located.
Specifically, for texture-rich scenes the image offset determination module 14 uses a grayscale template matching algorithm, as follows. Let the current image have width W and height H. A template image T of size Mx × My, with Mx = W - 8 and My = H - 8, is taken from the current frame at position [4, 4]; the search image S, of size Nx × Ny with Nx = W and Ny = H, is taken from the previous frame. During matching, the template is laid over the search image and shifted; the sub-image of the reference image covered by the template is S(i, j), where i, j denote the position of the sub-image's top-left pixel in S and take values in [-4, 4], and S(0, 0) corresponds to position [4, 4] of the previous frame. Matching evaluates the correlation function SAD to find the search sub-image most similar to the template together with its coordinates i and j; the (i, j) minimizing the SAD of T and S(i, j) is the best match position, i.e., the relative X and Y offset, or movement, between the two adjacent frames (the image X and Y offsets), with the offset lying in the range [-4, 4]. SAD (sum of absolute differences) accumulates the absolute values of the pixel-wise differences between two images at each position; the smaller the SAD value, the better the two images match, and the minimum can be taken as the best match.
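The following Python sketch follows this description literally (template from the current frame minus a 4-pixel border, search window of [-4, 4]); it is written for clarity rather than speed.

```python
import numpy as np

def sad_match(prev, curr, r=4):
    """Grayscale template matching as described: template T is the
    current frame without an r-pixel border, slid over the previous
    frame within [-r, r]; the (i, j) minimising the SAD is the image
    X, Y offset between the two adjacent frames."""
    h, w = curr.shape
    t = curr[r:h - r, r:w - r].astype(np.int32)
    best_sad, best_ij = None, (0, 0)
    for i in range(-r, r + 1):            # candidate X shift
        for j in range(-r, r + 1):        # candidate Y shift
            s = prev[r + j:h - r + j, r + i:w - r + i].astype(np.int32)
            sad = int(np.abs(t - s).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_ij = sad, (i, j)
    return best_ij
```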
For scenes with little texture, the image offset determination module 14 uses a Sobel gradient template matching algorithm. Specifically, it performs edge detection with the Sobel operator, which computes with two-dimensional templates: a horizontal template for the horizontal difference operation and a vertical template for the vertical difference operation, shown below.
Horizontal template:
-1 0 1
-2 0 2
-1 0 1
Vertical template:
-1 -2 -1
0 0 0
1 2 1
The image offset determination module 14 convolves the image plane with the above templates to obtain the horizontal convolution fx and the vertical convolution fy, and takes the gradient value as G = sqrt(fx^2 + fy^2). It then applies this gradient operation to the two adjacent images to obtain gradient matrices A and B, where A is the Sobel gradient matrix of the previous frame and B that of the current frame. Let the template image T, of size Mx × My with Mx = W - 8 and My = H - 8, be taken from B at position [4, 4]; the search image S, of size Nx × Ny with Nx = W and Ny = H, is taken from A. During matching the template is laid over the search image and shifted; the covered sub-image is S(i, j), with i, j the position of its top-left pixel in S, taking values in [-4, 4], and S(0, 0) corresponding to position [4, 4] of A. A difference operation between the gradient matrices T and S(i, j) yields a difference matrix C. The absolute values of the elements of C that satisfy both conditions below (conditions 1 and 2) are accumulated to give the sum SS(i, j).
A[r, c] > T (condition 1)
B[r, c] > T (condition 2)
Here A[r, c] is the gradient value at position (r, c) in matrix A, B[r, c] is the gradient value at position (r, c) in matrix B, r >= 0 and r < My, c >= 0 and c < Mx, and T is the gradient threshold. The (i, j) giving the smallest SS(i, j) is the best match position, i.e., the relative X and Y offset, or movement, between the two adjacent frames (the image X and Y offsets), with the offset in the range [-4, 4].
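A sketch of this gradient-based variant follows; it reuses the SAD-style search but accumulates |T - S(i, j)| only where both gradient matrices exceed the threshold T (conditions 1 and 2). The gradient threshold value is an assumption.

```python
import cv2
import numpy as np

def gradient_match(prev, curr, r=4, grad_t=30.0):
    """Sobel-gradient template matching for low-texture scenes:
    G = sqrt(fx^2 + fy^2) per frame, then SS(i, j) sums the absolute
    differences only at positions satisfying conditions 1 and 2.
    grad_t is an assumed value for the threshold T."""
    def sobel_mag(img):
        fx = cv2.Sobel(img, cv2.CV_32F, 1, 0)   # horizontal convolution fx
        fy = cv2.Sobel(img, cv2.CV_32F, 0, 1)   # vertical convolution fy
        return cv2.magnitude(fx, fy)

    a, b = sobel_mag(prev), sobel_mag(curr)     # A: previous, B: current frame
    h, w = b.shape
    tmpl = b[r:h - r, r:w - r]                  # template T from B at [4, 4]
    best_ss, best_ij = None, (0, 0)
    for i in range(-r, r + 1):
        for j in range(-r, r + 1):
            s = a[r + j:h - r + j, r + i:w - r + i]
            mask = (tmpl > grad_t) & (s > grad_t)   # conditions 1 and 2
            if not mask.any():
                continue
            ss = float(np.abs(tmpl - s)[mask].sum())
            if best_ss is None or ss < best_ss:
                best_ss, best_ij = ss, (i, j)
    return best_ij
```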
When the current scene of the flight device 100 is especially bright or especially dark, the image offset determination module 14 applies histogram equalization to even out the image brightness, then judges whether the scene is texture-rich, and selects the corresponding algorithm as described above according to whether the texture is rich.
When the current scene of the flight device 100 is a reflective scene, the image offset determination module 14 removes the shadows from the image, then judges whether the scene is texture-rich, and selects the corresponding algorithm as described above according to whether the texture is rich.
When the current scene of the flight device 100 is line-rich, the image offset determination module 14 divides the detected lines into horizontal and vertical ones. In the previous frame it finds the line Lph with the smallest angle difference from a horizontal line Lch detected in the current image, and computes the distance between the two lines to obtain j; when j lies outside [-R, R] it is set to 0, where R is the configured movement range, typically 4. In the previous frame it likewise finds the line Lpv with the smallest angle difference from a vertical line Lcv detected in the current image, and computes the distance between the two lines to obtain i, setting it to 0 when it lies outside [-R, R]. The (i, j) obtained by this line localization is the relative X and Y offset, or movement, between the two adjacent frames, i.e., the image X and Y offsets, with the offset in the range [-4, 4].
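One plausible reading of this line-localization step is sketched below; it assumes each detected line has already been reduced to an (angle, signed distance) pair, and that the horizontal match gives j while the vertical match gives i, as in the text. This is an interpretation, not the patent's exact procedure.

```python
def line_offset(prev_h, prev_v, curr_h, curr_v, r=4):
    """Sketch: for the current horizontal line Lch, find the previous
    horizontal line Lph with the smallest angle difference and take the
    signed distance between them as j; likewise Lcv/Lpv give i. Values
    outside [-r, r] are zeroed. Lines are (angle, distance) pairs."""
    def match(curr_line, prev_lines):
        angle, dist = curr_line
        p_angle, p_dist = min(prev_lines, key=lambda l: abs(l[0] - angle))
        d = dist - p_dist
        return d if -r <= d <= r else 0

    j = match(curr_h, prev_h)    # Y offset from horizontal lines
    i = match(curr_v, prev_v)    # X offset from vertical lines
    return i, j
```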
Thus, with the flight device 100 and the flight control system S1 of the present invention, the speed of the flight device 100 can be determined and positioning control performed without a GPS signal, and control can further be refined for different scenes.
Referring to FIG. 5, a flowchart of a flight control method according to an embodiment of the present invention is shown. The flight control method detects the speed of the flight device 100 and performs positioning control. First, the acquisition module 11 acquires the images captured by the camera module 20 (501).
The scene determination module 12 determines the current scene of the flight device 100 (503), determining the scene type from at least one parameter feature of the images captured by the camera module 20.
The height determination module 13 determines the height of the flight device 100 from the depth-of-field information of the images captured by the binocular camera module 20 (505).
The image offset determination module 14 computes, from two adjacent frames captured by the camera module 20 and the current scene of the flight device 100, the image X and Y offsets of the second frame relative to the first (505). The image offset determination module 14 identifies an identical feature point in the two adjacent frames and computes its X and Y offsets between the two frames to obtain the image X and Y offsets. The scene determination module 12 determines the scene type from at least one parameter of the images captured by the camera module 20; a corresponding algorithm is selected according to the current scene of the flight device 100, and the two adjacent frames are analyzed with that algorithm to obtain the horizontal image offsets of the second frame relative to the first.
The offset calibration module 15 obtains the acceleration and angular velocity of the flight device 100 in the three-dimensional directions detected by the acceleration sensor 30, and compensates the image X and Y offsets according to that acceleration and angular velocity to obtain a corrected image offset (507).
The velocity calculation module 16 computes, from the lens focal length, the height of the flight device 100, and the corrected image offset, the world-coordinate X and Y offsets corresponding to the corrected image offset, and derives the speed of the flight device from the time interval between the two adjacent frames and the world-coordinate X and Y offsets (509). Specifically, the velocity calculation module 16 computes the speed of the flight device in the X direction and in the Y direction from the time interval t1 and the world-coordinate X and Y offsets; more specifically, it divides the X and Y offsets by the time interval to obtain the speeds of the flight device 100 in the X and Y directions, respectively.
The operation control module 17 performs positioning and/or hover control of the flight device 100 based at least on its speed (511).
The above are merely preferred embodiments of the present invention and are not intended to limit it; any modifications, equivalent substitutions, and improvements made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (20)

  1. A flight control system for controlling a flight device, characterized in that the flight control system comprises:
    an acquisition module, configured to acquire images captured by a binocular camera module of the flight device;
    a scene determination module, configured to determine the scene in which the flight device is currently located;
    a height determination module, configured to determine the height of the flight device from depth-of-field information of the images captured by the binocular camera module;
    an image offset determination module, configured to calculate, from two adjacent frames captured by the binocular camera module and from the current scene of the flight device, an image X offset and an image Y offset of the second of the two adjacent frames relative to the first;
    an offset calibration module, configured to obtain the acceleration and angular velocity of the flight device in the three-dimensional directions detected by an acceleration sensor of the flight device, and to compensate the image X and Y offsets according to that acceleration and angular velocity to obtain a corrected image offset comprising a corrected image X offset and a corrected image Y offset; and
    a velocity calculation module, configured to calculate, from the lens focal length of the binocular camera module, the height of the flight device, and the corrected image offset, the world-coordinate X and Y offsets corresponding to the corrected image offset, and to derive the speed of the flight device from the time interval at which the two adjacent frames were captured and from the world-coordinate X and Y offsets.
  2. The flight control system of claim 1, characterized in that the height determination module determines the height of the flight device from the depth-of-field information of the images captured by the binocular camera module by: calibrating the two cameras of the binocular camera module to obtain the intrinsic parameters of the camera module, including the focal length, image center, and distortion coefficients, and the extrinsic parameters, including the rotation matrix and the translation matrix; rectifying the stereo pair; performing stereo matching to obtain a disparity map; and performing three-dimensional reconstruction to obtain the depth information, which is normalized to the range [0, 255] to give the depth map.
  3. The flight control system of claim 1, characterized in that the flight device further comprises a distance sensor for detecting the distance between the flight device and the ground, and the height determination module is configured to decide, according to the current scene of the flight device, whether to derive the height of the flight device from the depth-of-field information of the images captured by the camera module or to take the distance detected by the distance sensor as the height of the flight device.
  4. The flight control system of claim 3, characterized in that the scene determination module determines, from at least one parameter of the images captured by the camera module, whether the scene of the flight device is a texture-rich scene or a texture-sparse scene, the parameter including but not limited to texture.
  5. The flight control system of claim 4, characterized in that the height determination module selects, according to whether the scene is texture-rich and to the maximum area of regions of identical texture, either the height determined from the depth-of-field information of the images captured by the camera module or the distance detected by the distance sensor as the height of the flight device.
  6. The flight control system of claim 5, characterized in that, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions of similar depth values in the depth image is greater than a minimum SMIN and less than a maximum SMAX, the height determination module determines the height of the flight device from the depth-of-field information of the images captured by the cameras of the camera module, wherein the minimum SMIN is 1/4 of the maximum image area and the maximum SMAX is set to 3/4 of the maximum image area.
  7. The flight control system of claim 5, characterized in that, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions of similar depth values in the depth image is greater than the maximum SMAX, and the height for that region derived from the depth-of-field information of the images captured by the camera module differs from the height measured by the distance sensor by more than a threshold, the height determination module determines the height of the flight device from the depth-of-field information computed from the images captured by the camera module, wherein the maximum SMAX is 3/4 of the maximum image area.
  8. The flight control system of claim 5, characterized in that, in a texture-rich scene, if the height determination module determines that the maximum area AM of regions of similar depth values in the depth image is less than the minimum SMIN or greater than the maximum SMAX, the height determination module determines the height of the flight device from the depth-of-field information of the images captured by the camera module or takes the distance measured by the distance sensor as the height of the flight device, wherein the minimum SMIN is 1/4 of the maximum image area and the maximum SMAX is 3/4 of the maximum image area.
  9. The flight control system of claim 5, characterized in that, in a scene that is not texture-rich, the height determination module takes the distance measured by the distance sensor as the height of the flight device.
  10. The flight control system of claim 1, characterized in that the velocity calculation module calculates the world-coordinate X offset according to Formula 1: x1/X1 = f/H and the world-coordinate Y offset according to Formula 2: y1/Y1 = f/H, wherein x1 is the corrected image X offset, y1 is the corrected image Y offset, f is the lens focal length, H is the height of the flight device, X1 is the world-coordinate X offset, and Y1 is the world-coordinate Y offset; and the velocity calculation module calculates the speed of the flight device in the X direction and in the Y direction from the time interval at which the camera module captured the two adjacent frames and from the world-coordinate X and Y offsets.
  11. A flight control method for controlling a flight device, characterized in that the flight control method comprises:
    acquiring images captured by a binocular camera module of the flight device;
    determining the scene in which the flight device is currently located;
    determining the height of the flight device from depth-of-field information of the images captured by the binocular camera module;
    calculating, from two adjacent frames captured by the binocular camera module and from the current scene of the flight device, an image X offset and an image Y offset of the second of the two adjacent frames relative to the first;
    obtaining the acceleration and angular velocity of the flight device in the three-dimensional directions detected by an acceleration sensor of the flight device, and compensating the image X offset and image Y offset according to that acceleration and angular velocity to obtain a corrected image offset comprising a corrected image X offset and a corrected image Y offset; and
    calculating, from the lens focal length of the camera module, the height of the flight device, and the corrected image offset, the world-coordinate X and Y offsets corresponding to the corrected image offset, and deriving the speed of the flight device from the time interval at which the two adjacent frames were captured and from the world-coordinate X and Y offsets.
  12. The flight control method of claim 11, characterized in that the step of "determining the height of the flight device from depth-of-field information of the images captured by the binocular camera module" comprises:
    calibrating the two cameras of the binocular camera module to obtain the intrinsic parameters of the camera module, including the focal length, image center, and distortion coefficients, and the extrinsic parameters, including the rotation matrix and the translation matrix;
    rectifying the stereo pair;
    performing stereo matching to obtain a disparity map; and
    performing three-dimensional reconstruction to obtain the depth information, which is normalized to the range [0, 255] to give the depth map.
  13. The flight control method of claim 11, characterized in that the method further comprises:
    deciding, according to the current scene of the flight device, whether to derive the height of the flight device from the depth-of-field information of the images captured by the camera module or to take the distance detected by a distance sensor as the height of the flight device.
  14. The flight control method of claim 13, characterized in that the step of "determining the scene in which the flight device is currently located" comprises:
    determining, from at least one parameter of the images captured by the camera module, whether the scene of the flight device is a texture-rich scene or a texture-sparse scene, the parameter including but not limited to texture.
  15. The flight control method of claim 14, characterized in that the step of "deciding, according to the current scene of the flight device, whether to derive the height of the flight device from the depth-of-field information of the images captured by the camera module or to take the distance detected by the distance sensor as the height of the flight device" comprises:
    in a texture-rich scene, if the maximum area AM of regions of similar depth values in the depth image is greater than a minimum SMIN and less than a maximum SMAX, determining the height of the flight device from the depth-of-field information of the images captured by the cameras of the camera module, wherein the minimum SMIN is 1/4 of the maximum image area and the maximum SMAX is set to 3/4 of the maximum image area.
  16. The flight control method of claim 14, characterized in that the step of "deciding, according to the current scene of the flight device, whether to derive the height of the flight device from the depth-of-field information of the images captured by the camera module or to take the distance detected by the distance sensor as the height of the flight device" comprises:
    in a texture-rich scene, if it is determined that the maximum area AM of regions of similar depth values in the depth image is greater than the maximum SMAX and the height for that region derived from the depth-of-field information of the images captured by the camera module differs from the height measured by the distance sensor by more than a threshold, determining the height of the flight device from the depth-of-field information computed from the images captured by the camera module, wherein the maximum SMAX is 3/4 of the maximum image area.
  17. The flight control method of claim 14, characterized in that the step of "deciding, according to the current scene of the flight device, whether to derive the height of the flight device from the depth-of-field information of the images captured by the camera module or to take the distance detected by the distance sensor as the height of the flight device" comprises:
    in a texture-rich scene, if it is determined that the maximum area AM of regions of similar depth values in the depth image is less than the minimum SMIN or greater than the maximum SMAX, determining the height of the flight device from the depth-of-field information of the images captured by the camera module or taking the distance measured by the distance sensor as the height of the flight device, wherein the minimum SMIN is 1/4 of the maximum image area and the maximum SMAX is 3/4 of the maximum image area.
  18. The flight control method of claim 14, characterized in that the step of "deciding, according to the current scene of the flight device, whether to derive the height of the flight device from the depth-of-field information of the images captured by the camera module or to take the distance detected by the distance sensor as the height of the flight device" comprises:
    in a scene that is not texture-rich, taking the distance measured by the distance sensor as the height of the flight device.
  19. The flight control method of claim 11, characterized in that the step of "calculating, from the lens focal length of the binocular camera module, the height of the flight device, and the corrected image offset, the world-coordinate X and Y offsets corresponding to the corrected image offset" comprises:
    calculating the world-coordinate X offset according to the formula x1/X1 = f/H; and
    calculating the world-coordinate Y offset according to the formula y1/Y1 = f/H, wherein x1 is the corrected image X offset, y1 is the corrected image Y offset, f is the lens focal length, H is the height of the flight device, X1 is the world-coordinate X offset, and Y1 is the world-coordinate Y offset.
  20. A flight device, comprising a binocular camera module, a distance sensor, and an acceleration sensor, the binocular camera module comprising two cameras each configured to capture images, the distance sensor being configured to acquire the height of the flight device, and the acceleration sensor being configured to detect the acceleration and angular velocity of the flight device in the three-dimensional directions, characterized in that the flight device further comprises the flight control system of any one of claims 1 to 10.
PCT/CN2016/071016 2015-11-13 2016-01-15 Flight device, flight control system and method WO2017080108A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/625,225 US10234873B2 (en) 2015-11-13 2017-06-16 Flight device, flight control system and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510778779.4 2015-11-13
CN201510778779.4A CN105346706B (zh) 2015-11-13 2015-11-13 Flight device, flight control system and method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/625,225 Continuation US10234873B2 (en) 2015-11-13 2017-06-16 Flight device, flight control system and method

Publications (1)

Publication Number Publication Date
WO2017080108A1 true WO2017080108A1 (zh) 2017-05-18

Family

ID=55322823

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/071016 WO2017080108A1 (zh) 2015-11-13 2016-01-15 Flight device, flight control system and method

Country Status (3)

Country Link
US (1) US10234873B2 (zh)
CN (1) CN105346706B (zh)
WO (1) WO2017080108A1 (zh)

Also Published As

Publication number Publication date
CN105346706B (zh) 2018-09-04
CN105346706A (zh) 2016-02-24
US20170308103A1 (en) 2017-10-26
US10234873B2 (en) 2019-03-19

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16863299

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16863299

Country of ref document: EP

Kind code of ref document: A1