CN116952229A - Unmanned aerial vehicle positioning method, device, system and storage medium - Google Patents

Unmanned aerial vehicle positioning method, device, system and storage medium

Info

Publication number
CN116952229A
CN116952229A (application CN202311102665.9A)
Authority
CN
China
Prior art keywords
unmanned aerial vehicle, determining, optical flow, angular velocity
Prior art date
Legal status
Pending
Application number
CN202311102665.9A
Other languages
Chinese (zh)
Inventor
任雪峰
Current Assignee
Beijing Zhuoyi Intelligent Technology Co Ltd
Original Assignee
Beijing Zhuoyi Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhuoyi Intelligent Technology Co Ltd filed Critical Beijing Zhuoyi Intelligent Technology Co Ltd
Priority to CN202311102665.9A
Publication of CN116952229A
Legal status: Pending

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses an unmanned aerial vehicle positioning method, device, system and storage medium, wherein the method comprises the following steps: acquiring multiple frames of consecutive images of the unmanned aerial vehicle, and determining optical flow characteristics of the unmanned aerial vehicle from those images; determining the acceleration and angular velocity of the unmanned aerial vehicle from the data of its inertial measurement unit, and determining the Euler angles based on the angular velocity; inputting the optical flow characteristics, the acceleration, the angular velocity and the Euler angles into a preset estimation model based on an optical flow equation and a kinematic equation; and determining the altitude and velocity of the unmanned aerial vehicle based on the output of the estimation model. This technical scheme positions the unmanned aerial vehicle efficiently in flight, greatly improves the estimation accuracy, and achieves effective verification on the unmanned aerial vehicle platform.

Description

Unmanned aerial vehicle positioning method, device, system and storage medium
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle positioning method, device, system and storage medium.
Background
Positioning based on an optical flow sensor is a common, low-cost method for autonomous unmanned aerial vehicle positioning. Without GPS, an optical flow sensor mounted at the bottom of the unmanned aerial vehicle collects image data in real time; based on the apparent velocity of brightness patterns in the image data, an optical flow algorithm computes the displacement between successive images, thereby achieving autonomous positioning. However, positioning by optical flow alone suffers from problems such as large errors, low efficiency, and inaccurate positioning.
Disclosure of Invention
The present application has been made in view of the above problems, and its object is to provide an unmanned aerial vehicle positioning method, device, system and storage medium that overcome, or at least partially solve, the above problems.
According to one aspect of the present application, there is provided a method of unmanned aerial vehicle positioning, the method comprising:
acquiring multi-frame continuous images of the unmanned aerial vehicle, and determining optical flow characteristics of the unmanned aerial vehicle according to the multi-frame continuous images;
according to the data of the inertial measurement unit of the unmanned aerial vehicle, the acceleration and the angular velocity of the unmanned aerial vehicle are determined, and the Euler angle is determined based on the angular velocity;
inputting the optical flow characteristics, the acceleration, the angular velocity and the Euler angle into a preset estimation model based on an optical flow equation and a kinematic equation;
and determining the altitude and the speed of the unmanned aerial vehicle based on the output result of the estimation model.
In some embodiments, acquiring multiple frames of continuous images of the unmanned aerial vehicle, determining optical flow characteristics of the unmanned aerial vehicle from the multiple frames of continuous images comprises:
determining a speed of a single pixel in the image plane based on the intensity of the image pixel;
determining the movement speed of a feature in the image based on the speeds of a plurality of pixels.
In some embodiments, the step of constructing the preset estimation model based on the optical flow equation and the kinematic equation includes:
based on the characteristics of the image plane, constructing an estimation model comprising the unmanned aerial vehicle height parameter and the unmanned aerial vehicle speed parameter;
respectively determining expressions of angular speed, linear speed, height and Euler angle of the unmanned aerial vehicle in each coordinate system according to an inertial coordinate system, a body coordinate system and a camera coordinate system;
performing state estimation on the estimation model based on real-time values of the optical flow characteristics, the acceleration, the angular velocity and the Euler angles, together with the expressions, to obtain estimates of the altitude and velocity of the unmanned aerial vehicle.
In some embodiments, determining the expressions of the angular velocity, the linear velocity, and the altitude of the drone in each coordinate system, respectively, further includes:
and converting the expression in one coordinate system into the other coordinate systems to obtain the corresponding expressions in those coordinate systems.
In some embodiments, performing state estimation on the estimation model includes:
representing the estimation model with a Jacobian matrix based on the expressions and/or their time derivatives;
performing state estimation using an extended Kalman filter based on the Jacobian matrix.
In some embodiments, the method further comprises:
the inertial measurement unit of the unmanned aerial vehicle is pre-calibrated to determine the scale factors, bias and/or misalignment errors of the gyroscopes and accelerometers.
In some embodiments, determining the euler angle based on the angular velocity comprises:
and integrating the angular velocity based on an attitude and heading reference system (AHRS) algorithm to determine the Euler angles.
According to another aspect of the present application, there is provided a drone positioning device, the device comprising:
an optical flow feature determination module, adapted to acquire multiple frames of consecutive images of the unmanned aerial vehicle and determine the optical flow characteristics of the unmanned aerial vehicle from those images;
the inertial characteristic determining module is suitable for determining the acceleration and the angular velocity of the unmanned aerial vehicle according to the data of the inertial measurement unit of the unmanned aerial vehicle, and determining the Euler angle based on the angular velocity;
the model estimation module is suitable for inputting the optical flow characteristics, the acceleration, the angular velocity and the Euler angle into a preset estimation model based on an optical flow equation and a kinematic equation;
and the result determining module is suitable for determining the altitude and the speed of the unmanned aerial vehicle based on the estimation result of the estimation model.
According to yet another aspect of the present application, there is provided an unmanned aerial vehicle positioning system comprising an unmanned aerial vehicle platform, and an image sensor and an on-board image processing board arranged on the unmanned aerial vehicle platform;
the on-board image processing board comprises a processor and a memory arranged to store computer executable instructions which, when executed, cause the processor to perform the unmanned aerial vehicle positioning method according to any of the above embodiments.
According to a further aspect of the present application, there is provided a computer readable storage medium storing one or more programs which, when executed by a processor, implement a drone positioning method according to any one of the above.
From the above, according to the technical solution disclosed in the present application, the velocity vector and the vertical distance of the unmanned aerial vehicle (UAV) are estimated using IMU measurement data and the image sequence captured by a camera, which is particularly suitable for emergencies and GPS-denied environments. During estimation, the technical scheme combines the optical flow equations of the tracked features with the kinematic equations of the UAV's motion, and provides multi-feature tracking capability to improve estimation performance.
Moreover, the proposed method combines UAV dynamics in the image plane with the dynamics of the detected features to estimate the velocity vector and vertical distance independently of distance measurement sensors such as laser rangefinders or ultrasonic sensors.
Further, the method requires only the UAV's images, angular velocity and acceleration, all of which can be obtained from the UAV's image sensor and flight controller (IMU), improving efficiency and reducing equipment and operating costs.
The foregoing is only an overview of the technical solution of the present application. In order that its technical means may be understood more clearly and implemented in accordance with the description, and to make the above and other objects, features and advantages of the application more readily apparent, specific embodiments of the application are set forth below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
fig. 1 shows a flow diagram of a method of drone positioning according to one embodiment of the application;
FIG. 2 illustrates a schematic diagram of a system state model, according to one embodiment of the application;
FIG. 3 shows a schematic diagram of the conversion from the body coordinate system to the feature coordinate system, according to one embodiment of the application;
fig. 4 shows a schematic structural view of a positioning device for a drone according to an embodiment of the present application;
fig. 5 shows a schematic structural diagram of a drone platform according to one embodiment of the application;
fig. 6 shows a schematic structural diagram of an on-board image processing board according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the application to those skilled in the art.
Fig. 1 shows a flow diagram of a method for positioning a drone according to one embodiment of the application, comprising the steps of:
step S110, acquiring multi-frame continuous images of the unmanned aerial vehicle, and determining optical flow characteristics of the unmanned aerial vehicle according to the multi-frame continuous images;
step S120, according to the data of an inertial measurement unit of the unmanned aerial vehicle, determining the acceleration and the angular velocity of the unmanned aerial vehicle, and determining the Euler angle based on the angular velocity;
step S130, inputting the optical flow characteristics, the acceleration, the angular velocity and the Euler angle into a preset estimation model based on an optical flow equation and a kinematic equation;
and step S140, determining the altitude and the speed of the unmanned aerial vehicle based on the output result of the estimation model.
According to the technical scheme disclosed in the present application, the velocity vector of the UAV and its distance to the ground are estimated using IMU measurement data and an image sequence captured by a camera, which is particularly suitable for emergencies and GPS-denied environments. During estimation, the technical scheme combines the optical flow equations of the tracked features with the kinematic equations of the UAV, and provides multi-feature tracking capability to improve estimation performance.
In some embodiments, acquiring multiple frames of continuous images of the unmanned aerial vehicle, determining optical flow characteristics of the unmanned aerial vehicle from the multiple frames of continuous images comprises:
determining a speed of a single pixel in the image plane based on the intensity of the image pixel;
determining the movement speed of a feature in the image based on the speeds of a plurality of pixels.
Specifically, feature positions are extracted from the images captured by the camera using the concept of optical flow. This concept states that if the brightness between two consecutive images does not change, the intensity of the pixel representing the feature point $P = [X_p\ Y_p\ Z_p]^T$ remains constant, which yields the optical flow constraint:

$$I_x u + I_y v + I_t = 0 \qquad (1)$$

where $I$ is the intensity of the image, $[x\ y]$ is the projection of point $P$ onto the image plane, and $[u\ v]$ is the pixel velocity of the feature detected in the image plane. The method based on expression (1) applies a Sobel filter to compute the gradients $I_x$ and $I_y$, and then computes the terms $u$ and $v$ defined in expression (2) to track features in the image. Since a single pixel does not carry sufficient information, the feature motion in pixel coordinates is computed over the $n$ neighbors of the pixel of interest. The expressions for the $n$ neighbors can thus be written as:

$$-I_t(x_k, y_k) = I_x(x_k, y_k)\,u + I_y(x_k, y_k)\,v, \qquad k = 1, \ldots, n \qquad (2)$$

Rewriting expression (2) in matrix form:

$$\underbrace{\begin{bmatrix} I_x(x_1,y_1) & I_y(x_1,y_1) \\ \vdots & \vdots \\ I_x(x_n,y_n) & I_y(x_n,y_n) \end{bmatrix}}_{A} \begin{bmatrix} u \\ v \end{bmatrix} = \underbrace{\begin{bmatrix} -I_t(x_1,y_1) \\ \vdots \\ -I_t(x_n,y_n) \end{bmatrix}}_{b} \qquad (3)$$

Expression (3), with $n$ equations and 2 unknowns, can be solved by a conventional algorithm such as least squares (LS):

$$\begin{bmatrix} u \\ v \end{bmatrix} = (A^T A)^{-1} A^T b \qquad (4)$$
in addition, intrinsic and extrinsic parameters of the camera may also be determined by existing algorithms.
The acceleration vector and the angular velocity vector are measured by an IMU mounted on the drone. Accordingly:

$$\omega_m = \omega + b_\omega + \eta_\omega \qquad (5)$$

$$a_m = a + b_a + \eta_a \qquad (6)$$

where $\omega_m$ and $a_m$ are the measured angular velocity and acceleration, respectively, $\omega$ is the actual angular velocity, and $a$ is the linear acceleration in the UAV body frame. The biases $b_\omega$ and $b_a$ are eliminated in a calibration phase prior to flight. Optionally, the IMU is calibrated before testing to determine the scale factors, bias and misalignment errors of the gyroscope and accelerometer. Furthermore, the measurement noises $\eta_\omega$ and $\eta_a$ are modeled as Gaussian white noise.
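As a concrete illustration of expressions (5) and (6), the sketch below applies pre-flight calibration terms to raw IMU readings; the calibration structure (scale/misalignment matrices and bias vectors) and its field names are illustrative assumptions, since the patent gives no numeric calibration values.

```python
# A sketch of the IMU correction implied by expressions (5)-(6): subtract the
# pre-flight biases and apply the scale-factor/misalignment matrices.
import numpy as np

def correct_imu(omega_meas, accel_meas, calib):
    """Return bias- and scale-corrected angular velocity and acceleration.

    calib: dict with 'S_g', 'b_g' (gyro scale/misalignment matrix and bias)
    and 'S_a', 'b_a' (accelerometer equivalents), determined in the
    calibration phase prior to flight.
    """
    omega = calib["S_g"] @ (np.asarray(omega_meas) - calib["b_g"])
    accel = calib["S_a"] @ (np.asarray(accel_meas) - calib["b_a"])
    # Remaining errors are modeled as zero-mean Gaussian white noise.
    return omega, accel
```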
In some embodiments, the step of constructing the preset estimation model based on the optical flow equation and the kinematic equation includes:
based on the characteristics of the image plane, constructing an estimation model comprising the unmanned aerial vehicle height parameter and the unmanned aerial vehicle speed parameter;
respectively determining expressions of angular speed, linear speed, height and Euler angle of the unmanned aerial vehicle in each coordinate system according to an inertial coordinate system, a body coordinate system and a camera coordinate system;
performing state estimation on the estimation model based on real-time values of the optical flow characteristics, the acceleration, the angular velocity and the Euler angles, together with the expressions, to obtain estimates of the altitude and velocity of the unmanned aerial vehicle.
In some embodiments, determining the expressions of the angular velocity, the linear velocity, and the altitude of the drone in each coordinate system, respectively, further includes:
and converting the expression in one coordinate system into the other coordinate systems to obtain the corresponding expressions in those coordinate systems.
In some embodiments, performing state estimation on the estimation model includes:
representing the estimation model with a Jacobian matrix based on the expressions and/or their time derivatives;
performing state estimation using an extended Kalman filter based on the Jacobian matrix.
In some embodiments, the method further comprises:
the inertial measurement unit of the unmanned aerial vehicle is pre-calibrated to determine the scale factors, bias and/or misalignment errors of the gyroscopes and accelerometers.
In some embodiments, determining the euler angle based on the angular velocity comprises:
and integrating the angular velocity based on an attitude and heading reference system (AHRS) algorithm to determine the Euler angles.
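For the AHRS-style integration just described, one common realization maps the body rates measured by the gyroscope to Euler-angle rates and integrates them over time. The sketch below assumes the standard roll-pitch-yaw (ZYX) kinematic relation; the patent does not spell out its exact AHRS algorithm, so this is an assumption for illustration.

```python
# A sketch of AHRS-style Euler-angle integration: body rates [p, q, r] from
# the gyroscope are mapped to Euler-angle rates and integrated one time step.
import numpy as np

def integrate_euler(euler, gyro, dt):
    """One integration step. euler = [phi, theta, psi] (rad), gyro = [p, q, r] (rad/s)."""
    phi, theta, _ = euler
    p, q, r = gyro
    phi_dot = p + np.tan(theta) * (q * np.sin(phi) + r * np.cos(phi))
    theta_dot = q * np.cos(phi) - r * np.sin(phi)
    psi_dot = (q * np.sin(phi) + r * np.cos(phi)) / np.cos(theta)  # singular at theta = +/-90 deg
    return np.asarray(euler) + dt * np.array([phi_dot, theta_dot, psi_dot])
```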
Specifically, the above embodiments provide the estimation model required to estimate the altitude and velocity vector of a quadrotor from image data. An optical flow formulation is used to show how the altitude and velocity vector affect the position of a detected feature in the camera plane. The system state to be estimated is therefore taken as

$$\mathbf{x} = \begin{bmatrix} x_1 & y_1 & \cdots & x_n & y_n & V_x & V_y & V_z & Z \end{bmatrix}^T$$

where $[x_i\ y_i]$ is the $i$-th feature projected on the camera plane, $[V_x\ V_y\ V_z]$ is the translational velocity of the UAV in the inertial frame, and $Z$ is the z-component of the UAV position in the inertial frame relative to the ground. The dynamics and measurement equations are formulated as:

$$\dot{\mathbf{x}} = F(\mathbf{x}, \mathbf{u}, t) + \mathbf{w} \qquad (7)$$

$$\mathbf{z} = h(\mathbf{x}) + \boldsymbol{\delta} \qquad (8)$$

where $F$ is a nonlinear time-varying function, $\mathbf{u}$ is the actuation of the system, $t$ denotes time, and $\mathbf{w}$ is the process noise, modeled as white noise with power spectral density $Q$. The measurement $\mathbf{z}$ is given by the nonlinear function $h$, with measurement noise $\boldsymbol{\delta}$ of power spectral density $R$. A schematic diagram of the model used in the present application is shown in fig. 2.
Three coordinate systems are used: the inertial frame $(X_I, Y_I, Z_I)$, the body frame $(X_B, Y_B, Z_B)$ and the camera frame $(X_C, Y_C, Z_C)$. The position and velocity of the UAV are expressed in $(X_I, Y_I, Z_I)$, while the IMU measures in $(X_B, Y_B, Z_B)$. Furthermore, the camera captures images in the camera frame, whose image plane is the xy plane. Without loss of generality, assume that $X_C Y_C Z_C$ is parallel to $X_B Y_B Z_B$, that the center of the body frame lies at the distance $f$ (the focal length) from the camera focus, and that the center of the camera frame is at the center of the image plane, so that a feature at position $[X^C\ Y^C\ Z^C]^T$ in the camera frame projects to:

$$x = f\,\frac{X^C}{Z^C}, \qquad y = f\,\frac{Y^C}{Z^C} \qquad (9)$$
considering the kinematic equations, and assuming the speed of a quadrotor in the fuselage frame, the speed of the fixed point on the inertial axis can be expressed within the fuselage frame as:
wherein the method comprises the steps ofIs the pixel location of feature i in the camera plane. The angular velocity and the linear velocity of the unmanned aerial vehicle are replaced by +.>And->And considering the geometric equation, the optical flow equation can be written as:
where f is the focal length of the camera. The expression (10) is rearranged in a matrix form as:
where d is the distance between the camera and the feature. According to expression (11), having the angular velocity of the camera and the position and velocity of the detected feature enables calculation of the pressA speed vector of the scaled camera. This parameter, called optical flow, can be used as a measure of estimated position and velocity. Thus, when the image sensor closes a feature, the rate of change of pixel position increases with the speed and distance between the sensor and the feature.
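To make expression (11) concrete, the sketch below predicts the pixel velocity of a feature from the camera's linear and angular velocity, the feature depth $d$, and the feature's pixel position. The sign convention follows the interaction-matrix form reconstructed above, which is itself an assumption about the patent's exact notation.

```python
# A sketch of the measurement model of expression (11): predict the pixel
# velocity [x_dot, y_dot] from the camera motion and the feature geometry.
import numpy as np

def predicted_pixel_velocity(x, y, f, d, v, omega):
    """v = [Vx, Vy, Vz], omega = [wx, wy, wz]; f is the focal length,
    d the camera-to-feature distance."""
    L = np.array([
        [-f / d, 0.0, x / d, x * y / f, -(f + x * x / f), y],
        [0.0, -f / d, y / d, f + y * y / f, -x * y / f, -x],
    ])
    return L @ np.concatenate([np.asarray(v, float), np.asarray(omega, float)])
```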
The relationship between the parameter $d_i$ and the system state can be determined according to fig. 3. The corresponding coordinate frame, called the feature frame, has its origin at the center of the camera and is defined by:
1. rotating the body coordinate system $(X_B, Y_B, Z_B)$ about the y-axis, and then
2. rotating the resulting frame about the x-axis,
so that the z-direction of the feature frame points toward the detected feature. The detected feature can thus be represented in the body frame as:

$$P_i^B = R_F^B\, P_i^F \qquad (12)$$

where $P_i^B$ is the position of the $i$-th feature in the body frame and $P_i^F = [0\ 0\ d_i']^T$ is the position of the same feature in the feature frame. Finally, the feature position in the inertial frame is:

$$P_i^I = R_B^I\, P_i^B \qquad (13)$$

Here $R_B^I$ is the transformation matrix from the body frame to the inertial frame, derived using the Euler angles $(\phi, \theta, \psi)$ as follows:

$$R_B^I = \begin{bmatrix} c\theta\,c\psi & s\phi\,s\theta\,c\psi - c\phi\,s\psi & c\phi\,s\theta\,c\psi + s\phi\,s\psi \\ c\theta\,s\psi & s\phi\,s\theta\,s\psi + c\phi\,c\psi & c\phi\,s\theta\,s\psi - s\phi\,c\psi \\ -s\theta & s\phi\,c\theta & c\phi\,c\theta \end{bmatrix} \qquad (14)$$

where $c(\cdot)$ and $s(\cdot)$ denote the $\cos(\cdot)$ and $\sin(\cdot)$ functions, respectively. Inserting the vectors $P_i^F$ and the rotations into expression (13) yields expression (15), whose third component is the altitude $Z$ of the UAV. Thus

$$Z = a_{33}(x_i, y_i)\, d_i' \qquad (16)$$

where $a_{33}(x_i, y_i)$ is the (3,3) entry of the combined transformation, which depends on the pixel coordinates of the feature. Using expression (12), $d_i$ is computed by transferring the vector $P_i^F$ into the body frame; equivalently, from expression (16),

$$d_i' = \frac{Z}{a_{33}(x_i, y_i)} \qquad (17)$$

The time derivatives of the velocity and altitude of the UAV can then be expressed as:

$$\dot{V}^I = R_B^I\, a^B + g \qquad (18)$$

$$\dot{Z} = V_z \qquad (19)$$

where $a^B$ is the acceleration measured in the body frame and $g$ is the gravity vector.
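The body-to-inertial transformation of expression (14) can be built directly from the Euler angles. The sketch below assumes the ZYX (roll-pitch-yaw) convention used in the reconstruction above.

```python
# A sketch of the body-to-inertial rotation of expression (14), ZYX convention.
import numpy as np

def rot_body_to_inertial(phi, theta, psi):
    c, s = np.cos, np.sin
    return np.array([
        [c(theta) * c(psi), s(phi) * s(theta) * c(psi) - c(phi) * s(psi), c(phi) * s(theta) * c(psi) + s(phi) * s(psi)],
        [c(theta) * s(psi), s(phi) * s(theta) * s(psi) + c(phi) * c(psi), c(phi) * s(theta) * s(psi) - s(phi) * c(psi)],
        [-s(theta), s(phi) * c(theta), c(phi) * c(theta)],
    ])
```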
expressions (11), (18) and (19) are nonlinear time derivatives of the system state defined in the form of expression (7). To apply an Extended Kalman Filter (EKF) to the state estimation, a jacobian matrix of equation (9) is applied to represent a linearized version of the state equation, as follows:
wherein matrix Γ i For each featureThe definition is as follows:
furthermore, the matrix ψ i Parameter theta i The definition is as follows:
applying L-K feature detection techniques to images captured by camerasIt can be determined. Therefore, the measurement equation based on expression (8) can be calculated as:
to use EKF for state estimation, expression (24) may be rearranged as:
since the system state equation is nonlinear, the EKF is used to estimate the state of the system. Assume that the procedure controlled by the nonlinear stochastic differential equation and measurement is as follows:
wherein the method comprises the steps ofIs a state vector,/->Is the control vector, wk and vk are the process noise and the measurement noise, respectively. Process noise is mainly related to the measurement of acceleration and angular velocity, whereas measurement noise reflects the uncertainty associated with the image detection features. It is assumed that itThey are white noise with a normal probability distribution, as follows:
p(w)=N(0,Q),p(v)=N(0,R) (28)
where Q is the 6 x 6 covariance matrix of the process noise, reflecting mainly the uncertainty associated with the system drive, and R is the 2 x2 covariance matrix of the measurement noise, specifying the uncertainty associated with the measurement. Thus, the process and measurement noise covariance matrix is considered to be Q 0 =10 -8 ×I 6×6 And R is 0 =c×I 2×2 Where c is the length of the pixel on the image plane, which in the present application is considered to be c=3.35×10 in relation to the camera applied -6 (m)。
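Putting expressions (26)-(28) together, the sketch below shows a generic EKF predict/update loop using the stated covariances $Q_0 = 10^{-8} \times I_{6\times6}$ and $R_0 = c \times I_{2\times2}$. The dynamics function and Jacobians are placeholders for the patent's state and measurement equations, and the class and parameter names are illustrative assumptions, not the patent's implementation.

```python
# A generic EKF predict/update sketch for the model of expressions (26)-(28).
import numpy as np

PIXEL_SIZE = 3.35e-6  # c, length of a pixel on the image plane (m)

class DroneEKF:
    def __init__(self, x0):
        self.x = np.asarray(x0, float)          # state [x1, y1, ..., Vx, Vy, Vz, Z]
        self.P = np.eye(len(self.x)) * 1e-3     # initial state covariance (assumed)
        self.Q = 1e-8 * np.eye(6)               # process noise covariance Q0
        self.R = PIXEL_SIZE * np.eye(2)         # per-feature measurement noise R0

    def predict(self, f_dyn, F_jac, G_jac, u, dt):
        """Propagate with continuous dynamics x_dot = f_dyn(x, u); F_jac and
        G_jac are its Jacobians w.r.t. the state and the IMU input."""
        self.x = self.x + f_dyn(self.x, u) * dt
        F = np.eye(len(self.x)) + F_jac(self.x, u) * dt   # discretized transition
        G = G_jac(self.x, u) * dt
        self.P = F @ self.P @ F.T + G @ self.Q @ G.T

    def update(self, z_i, H_i):
        """Correct with one tracked feature's pixel measurement z_i = [x_i, y_i];
        H_i is the 2xN Jacobian of the measurement equation (24)."""
        S = H_i @ self.P @ H_i.T + self.R
        K = self.P @ H_i.T @ np.linalg.inv(S)             # Kalman gain
        self.x = self.x + K @ (z_i - H_i @ self.x)
        self.P = (np.eye(len(self.x)) - K @ H_i) @ self.P
```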
According to another aspect of the present application, referring to fig. 4, there is provided a positioning device for a drone, the device 400 including:
an optical flow feature determination module 410, adapted to acquire multiple frames of consecutive images of the unmanned aerial vehicle and determine the optical flow characteristics of the unmanned aerial vehicle from those images;
the inertial feature determining module 420 is adapted to determine acceleration and angular velocity of the unmanned aerial vehicle according to data of an inertial measurement unit of the unmanned aerial vehicle, and determine euler angles based on the angular velocity;
the model estimation module 430 is adapted to input the optical flow characteristics, acceleration, angular velocity and euler angle into a preset estimation model based on an optical flow equation and a kinematic equation;
the result determination module 440 is adapted to determine the altitude and the speed of the drone based on the estimation result of the estimation model.
In some embodiments, the optical flow feature determination module 410 is adapted to:
determining a speed of a single pixel in the image plane based on the intensity of the image pixel;
determining the movement speed of a feature in the image based on the speeds of a plurality of pixels.
In some embodiments, the step of constructing the preset estimation model based on the optical flow equation and the kinematic equation includes:
based on the characteristics of the image plane, constructing an estimation model comprising the unmanned aerial vehicle height parameter and the unmanned aerial vehicle speed parameter;
respectively determining expressions of angular speed, linear speed, height and Euler angle of the unmanned aerial vehicle in each coordinate system according to an inertial coordinate system, a body coordinate system and a camera coordinate system;
performing state estimation on the estimation model based on real-time values of the optical flow characteristics, the acceleration, the angular velocity and the Euler angles, together with the expressions, to obtain estimates of the altitude and velocity of the unmanned aerial vehicle.
In some embodiments, determining the expressions of the angular velocity, the linear velocity, and the altitude of the drone in each coordinate system, respectively, further includes:
and converting the expression in one coordinate system into the other coordinate systems to obtain the corresponding expressions in those coordinate systems.
In some embodiments, performing state estimation on the estimation model includes:
representing the estimation model with a Jacobian matrix based on the expressions and/or their time derivatives;
performing state estimation using an extended Kalman filter based on the Jacobian matrix.
In some embodiments, the apparatus is further adapted to:
the inertial measurement unit of the unmanned aerial vehicle is pre-calibrated to determine the scale factors, bias and/or misalignment errors of the gyroscopes and accelerometers.
In some implementations, the inertial feature determination module 420 is further adapted to:
and integrating the angular velocity based on an attitude and heading reference system (AHRS) algorithm to determine the Euler angles.
It should be noted that the specific implementation of each apparatus embodiment may refer to the specific implementation of the corresponding method embodiment, and is not repeated here.
According to yet another aspect of the present application, there is provided a unmanned aerial vehicle positioning system comprising: unmanned aerial vehicle platform, image sensor and airborne image processing board.
Optionally, in a specific embodiment, referring to fig. 5, the unmanned aerial vehicle platform may be a femto X450 drone 501, fitted with an Intel RealSense D435i binocular depth camera as the image sensor 502. The front of the D435i camera has four round holes: from left to right, the first and third are the infrared sensors (IR Stereo Camera), the second is the infrared laser emitter (IR Projector), and the fourth is the color camera (Color Sensor). The camera captures at distances of up to 10 meters, with a video frame rate of up to 90 fps.
Optionally, in a specific embodiment, the on-board image processing board is an Nvidia Jetson TX2 embedded platform for intelligent unmanned systems. Its GPU uses the Nvidia Pascal(TM) architecture with 256 CUDA cores, and its CPU has six cores: a dual-core Denver 2 processor plus a quad-core ARM Cortex-A57. With strong performance in a small form factor, it is well suited to intelligent edge devices such as robots, unmanned aerial vehicles and smart cameras.
It should be noted that:
the algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose devices may also be used with the teachings herein. The required structure for the construction of such devices is apparent from the description above. In addition, the present application is not directed to any particular programming language. It will be appreciated that the teachings of the present application described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided for disclosure of enablement and best mode of the present application.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be construed as reflecting the intention that: i.e., the claimed application requires more features than are expressly recited in each claim.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in a drone positioning device according to embodiments of the application may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present application can also be implemented as an apparatus or device program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
The embodiment of the application provides a non-volatile computer storage medium storing at least one executable instruction, which can perform the unmanned aerial vehicle positioning method in any of the above method embodiments.
Fig. 6 shows a schematic structural diagram of an embodiment of the on-board image processing board of the present application, and the embodiment of the present application is not limited to the specific implementation of the on-board image processing board.
As shown in fig. 6, the on-board image processing board may include: a processor 602, a communication interface (Communications Interface) 604, a memory 606, and a communication bus 608.
Wherein: processor 602, communication interface 604, and memory 606 perform communication with each other via communication bus 608. Communication interface 604 is used to communicate with network elements of other devices, such as clients or other servers. The processor 602 is configured to execute the program 610, and may specifically perform relevant steps in the above-described embodiment of the unmanned aerial vehicle positioning method for the on-board image processing board.
In particular, program 610 may include program code including computer-operating instructions.
The processor 602 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present application. The one or more processors included in the on-board image processing board may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
A memory 606 stores the program 610. The memory 606 may comprise high-speed RAM, and may further comprise non-volatile memory, such as at least one disk memory.
The program 610 may be specifically configured to cause the processor 602 to perform operations corresponding to the above-described embodiments of the unmanned aerial vehicle positioning method.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.

Claims (10)

1. A method of unmanned aerial vehicle positioning, the method comprising:
acquiring multi-frame continuous images of the unmanned aerial vehicle, and determining optical flow characteristics of the unmanned aerial vehicle according to the multi-frame continuous images;
according to the data of the inertial measurement unit of the unmanned aerial vehicle, the acceleration and the angular velocity of the unmanned aerial vehicle are determined, and the Euler angle is determined based on the angular velocity;
inputting the optical flow characteristics, the acceleration, the angular velocity and the Euler angle into a preset estimation model based on an optical flow equation and a kinematic equation;
and determining the altitude and the speed of the unmanned aerial vehicle based on the output result of the estimation model.
2. The method of claim 1, wherein acquiring a plurality of frames of sequential images of the unmanned aerial vehicle, and determining optical flow characteristics of the unmanned aerial vehicle from the plurality of frames of sequential images comprises:
determining a speed of a single pixel in the image plane based on the intensity of the image pixel;
determining the movement speed of a feature in the image based on the speeds of a plurality of pixels.
3. The method according to claim 1, wherein the step of constructing a predetermined estimation model based on an optical flow equation and a kinematic equation includes:
based on the characteristics of the image plane, constructing an estimation model comprising the unmanned aerial vehicle height parameter and the unmanned aerial vehicle speed parameter;
respectively determining expressions of angular speed, linear speed, height and Euler angle of the unmanned aerial vehicle in each coordinate system according to an inertial coordinate system, a body coordinate system and a camera coordinate system;
performing state estimation on the estimation model based on real-time values of the optical flow characteristics, the acceleration, the angular velocity and the Euler angles, together with the expressions, to obtain estimates of the altitude and velocity of the unmanned aerial vehicle.
4. A method according to claim 3, wherein determining the expressions of the angular velocity, the linear velocity and the altitude of the drone in the respective coordinate systems, respectively, further comprises:
and converting the expression in one coordinate system into the other coordinate systems to obtain the corresponding expressions in those coordinate systems.
5. A method according to claim 3, wherein performing state estimation on the estimation model comprises:
representing the estimation model with a Jacobian matrix based on the expressions and/or their time derivatives;
performing state estimation using an extended Kalman filter based on the Jacobian matrix.
6. The method according to any one of claims 1-5, further comprising:
the inertial measurement unit of the unmanned aerial vehicle is pre-calibrated to determine the scale factors, bias and/or misalignment errors of the gyroscopes and accelerometers.
7. The method of any one of claims 1-5, wherein determining an euler angle based on the angular velocity comprises:
and integrating the angular velocity based on an attitude and heading reference system (AHRS) algorithm to determine the Euler angles.
8. An unmanned aerial vehicle positioning device, the device comprising:
an optical flow feature determination module, adapted to acquire multiple frames of consecutive images of the unmanned aerial vehicle and determine the optical flow characteristics of the unmanned aerial vehicle from those images;
the inertial characteristic determining module is suitable for determining the acceleration and the angular velocity of the unmanned aerial vehicle according to the data of the inertial measurement unit of the unmanned aerial vehicle, and determining the Euler angle based on the angular velocity;
the model estimation module is suitable for inputting the optical flow characteristics, the acceleration, the angular velocity and the Euler angle into a preset estimation model based on an optical flow equation and a kinematic equation;
and the result determining module is suitable for determining the altitude and the speed of the unmanned aerial vehicle based on the estimation result of the estimation model.
9. An unmanned aerial vehicle positioning system, comprising an unmanned aerial vehicle platform, and an image sensor and an on-board image processing board arranged on the unmanned aerial vehicle platform;
the on-board image processing board comprising a processor and a memory arranged to store computer executable instructions which, when executed, cause the processor to perform the unmanned aerial vehicle positioning method according to any of claims 1-7.
10. A computer readable storage medium, characterized in that it stores one or more programs, which when executed by a processor, implement the unmanned aerial vehicle positioning method according to any of claims 1-7.
CN202311102665.9A, filed 2023-08-29 (priority date 2023-08-29): Unmanned aerial vehicle positioning method, device, system and storage medium. Status: Pending. Published as CN116952229A (en).

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311102665.9A CN116952229A (en) 2023-08-29 2023-08-29 Unmanned aerial vehicle positioning method, device, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311102665.9A CN116952229A (en) 2023-08-29 2023-08-29 Unmanned aerial vehicle positioning method, device, system and storage medium

Publications (1)

Publication Number Publication Date
CN116952229A 2023-10-27

Family

ID=88449385

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311102665.9A Pending CN116952229A (en) 2023-08-29 2023-08-29 Unmanned aerial vehicle positioning method, device, system and storage medium

Country Status (1)

Country Link
CN (1) CN116952229A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117739972A (en) * 2024-02-18 2024-03-22 中国民用航空飞行学院 Unmanned aerial vehicle approach stage positioning method without global satellite positioning system
CN117739972B (en) * 2024-02-18 2024-05-24 中国民用航空飞行学院 Unmanned aerial vehicle approach stage positioning method without global satellite positioning system


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination