CN113077436A - Target position determining method and device based on aircraft - Google Patents


Info

Publication number
CN113077436A
Authority
CN
China
Prior art keywords
target
coordinate
preset
aircraft
ground area
Prior art date
Legal status
Pending
Application number
CN202110344049.9A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee
Suzhou Zhendi Intelligent Technology Co Ltd
Original Assignee
Suzhou Zhendi Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Zhendi Intelligent Technology Co Ltd
Priority claimed from CN202110344049.9A
Publication of CN113077436A
Legal status: Pending

Classifications

    • G06T 7/0002 - Image analysis; inspection of images, e.g. flaw detection
    • G01B 11/002 - Measuring arrangements using optical techniques for measuring two or more coordinates
    • G01C 11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
    • G01C 11/02 - Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G06T 7/55 - Depth or shape recovery from multiple images
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10004 - Still image; photographic image (image acquisition modality indexing scheme)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Quality & Reliability (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides an aircraft-based target position determining method and device. The aircraft comprises an aircraft body, a monocular camera arranged on the aircraft body to photograph a target, and a computing device electrically connected with the monocular camera. The method comprises the following steps: acquiring, at a first preset moment, first depth information of a preset ground area directly below the aircraft and a first coordinate, in a world coordinate system, of a first preset point of the preset ground area; calculating an inclination angle of the preset ground area relative to the horizontal plane according to the first depth information; acquiring a second coordinate of the optical center of the monocular camera in the world coordinate system at the first preset moment and a first image of the target shot by the monocular camera at the first preset moment, and acquiring a third coordinate of the target in the first image; and calculating the target coordinate of the target in the world coordinate system at the first preset moment according to the inclination angle, the first coordinate, the second coordinate and the third coordinate.

Description

Target position determining method and device based on aircraft
Technical Field
The application relates to the technical field of aircrafts, in particular to a target position determining method and device based on an aircraft.
Background
Existing target state estimation methods generally predict the state of a target in the next second from its current state using a uniform-motion prediction model, photograph the target with a monocular camera to obtain a target image in that next second, determine the target's state at that moment from the image, and finally update the predicted state with the measured state so as to determine the target's position.
In the above mode, because the target image is captured by a monocular camera, which cannot estimate depth (height above ground), existing methods assume the ground height (depth) remains unchanged when estimating the target state. In practice, however, the ground height changes continuously, so existing target state estimation methods suffer from inaccurate estimates.
Disclosure of Invention
An object of the embodiments of the present application is to provide an aircraft-based target position determining method and device, an electronic device, and a storage medium, so as to solve the inaccurate-estimation problem of existing target state estimation methods.
In a first aspect, the present invention provides an aircraft-based target position determining method. The aircraft includes an aircraft body, a monocular camera provided on the aircraft body to photograph the target, and a computing device electrically connected to the monocular camera. Applied to the computing device, the method includes: acquiring, at a first preset moment, first depth information of a preset ground area directly below the aircraft and a first coordinate, in a world coordinate system, of a first preset point of the preset ground area; calculating an inclination angle of the preset ground area relative to the horizontal plane according to the first depth information; acquiring a second coordinate of the optical center of the monocular camera in the world coordinate system at the first preset moment and a first image of the target shot by the monocular camera at the first preset moment, and acquiring a third coordinate of the target in the first image; and calculating the target coordinate of the target in the world coordinate system at the first preset moment according to the inclination angle, the first coordinate, the second coordinate and the third coordinate.
In the target position determining method designed above, first depth information of the preset ground area directly below the aircraft is obtained, and the inclination angle (slope) of that area relative to the horizontal plane is calculated from it. The first coordinate of the first preset point in the world coordinate system at the first preset moment, the second coordinate of the monocular camera's optical center in the world coordinate system at that moment, and the third coordinate of the target in the first image shot by the monocular camera are then obtained, and the target coordinate of the target in the world coordinate system at the first preset moment is calculated from the inclination angle and the first, second and third coordinates. When estimating the target position, the slope of the ground area where the target is located is thus determined and factored into the estimate of the target's real position. This resolves the inaccuracy caused by existing methods assuming an unchanged ground height (depth) while the ground height actually changes continuously, and improves the accuracy of target position estimation.
In an optional implementation manner of the first aspect, calculating the inclination angle of the preset ground area relative to the horizontal plane according to the first depth information includes: acquiring the maximum depth value and the minimum depth value in the first depth information, and the distance between the points at which they occur; and calculating the inclination angle of the preset ground area relative to the horizontal plane from the maximum depth value, the minimum depth value and that distance.
In an optional implementation manner of the first aspect, the calculating, according to the inclination angle, the first coordinate, the second coordinate, and the third coordinate, a target coordinate of the target in a world coordinate system at a first preset time includes: determining a first linear equation according to the inclination angle and the first coordinate; determining a second linear equation from the optical center of the monocular camera to the target according to the second coordinate and the third coordinate; and calculating the coordinates of the intersection point of the first linear equation and the second linear equation to obtain the coordinates of the target under the world coordinate system at the first preset moment.
In an optional implementation manner of the first aspect, the first depth information includes depth information of the first preset point, and after the calculating, according to the inclination angle, the first coordinate, the second coordinate, and the third coordinate, target coordinates of a target in a world coordinate system at a first preset time, the method further includes: acquiring second depth information of a second preset point of a preset ground area of the aircraft at a second preset time, wherein the second preset time is the time when the aircraft is located at the target coordinate; and correcting the target coordinate of the target under the world coordinate system at the first preset moment according to the second depth information of the second preset point.
In an optional implementation of the first aspect, the method further comprises: acquiring a first depth map containing the target; determining a fourth coordinate of the target in the first depth map according to the third coordinate of the target in the first image; and determining the coordinates of the target in a world coordinate system according to the fourth coordinates and the depth information of the target in the first depth map.
In an optional implementation manner of the first aspect, the aircraft includes a first binocular camera disposed on the aircraft body for shooting the target, the first binocular camera is electrically connected to the computing device, and the acquiring a first depth map including the target includes: acquiring a first binocular image of the target photographed by the first binocular camera; a first depth map containing the object is generated from the first binocular image of the object.
In the two embodiments designed above, a first binocular image of the target is taken by the first binocular camera to obtain the first depth map; the fourth coordinate of the target in the first depth map is determined from the third coordinate of the target in the first image; and the coordinate of the target in the world coordinate system is obtained from the fourth coordinate and the target's depth information in the first depth map. The real position of the target on changing terrain can thus be estimated simply from the first binocular camera's target depth map and the monocular camera's target image. This resolves the inaccuracy caused by existing methods assuming an unchanged ground height (depth) while the ground height actually changes continuously, and improves the accuracy of target position estimation.
In an optional implementation manner of the first aspect, the aircraft includes a second binocular camera disposed at the bottom of the aircraft body to photograph the preset ground area, and acquiring the first depth information of the preset ground area directly below the aircraft at the first preset moment includes: acquiring a second binocular image of the preset ground area shot by the second binocular camera; and generating a second depth map of the preset ground area from the second binocular image to obtain the first depth information.
In a second aspect, the present invention provides an aircraft-based target position determining apparatus. The aircraft includes an aircraft body, a monocular camera provided on the aircraft body to photograph the target, and a computing device electrically connected to the monocular camera. Applied to the computing device, the apparatus includes: an acquiring module configured to acquire, at a first preset moment, first depth information of the preset ground area directly below the aircraft and a first coordinate of a first preset point of the preset ground area in the world coordinate system; and a calculation module configured to calculate the inclination angle of the preset ground area relative to the horizontal plane according to the first depth information. The acquiring module is further configured to acquire a second coordinate of the optical center of the monocular camera in the world coordinate system at the first preset moment and a first image containing the target shot by the monocular camera at the first preset moment, and to acquire a third coordinate of the target in the first image; the calculation module is further configured to calculate the target coordinate of the target in the world coordinate system at the first preset moment according to the inclination angle, the first coordinate, the second coordinate and the third coordinate.
In the target position determining device designed above, first depth information of the preset ground area directly below the aircraft is obtained, and the inclination angle (slope) of that area relative to the horizontal plane is calculated from it. The first coordinate of the first preset point in the world coordinate system at the first preset moment, the second coordinate of the monocular camera's optical center in the world coordinate system at that moment, and the third coordinate of the target in the first image shot by the monocular camera are then obtained, and the target coordinate of the target in the world coordinate system at the first preset moment is calculated from the inclination angle and the first, second and third coordinates. When estimating the target position, the slope of the ground area where the target is located is thus determined and factored into the estimate of the target's real position. This resolves the inaccuracy caused by existing methods assuming an unchanged ground height (depth) while the ground height actually changes continuously, and improves the accuracy of target position estimation.
In an optional implementation manner of the second aspect, the calculation module is specifically configured to obtain a maximum depth value, a minimum depth value, and a distance value between the maximum depth value and the minimum depth value in the first depth information; and calculating the inclination angle of the preset ground area relative to the horizontal plane according to the maximum depth value, the minimum depth value and the distance value between the maximum depth value and the minimum depth value.
In an optional implementation manner of the second aspect, the calculation module is further specifically configured to determine a first linear equation according to the inclination angle and the first coordinate; determining a second linear equation from the optical center of the monocular camera to the target according to the second coordinate and the third coordinate; and calculating the coordinates of the intersection point of the first linear equation and the second linear equation to obtain the coordinates of the target under the world coordinate system at the first preset moment.
In an optional implementation manner of the second aspect, the obtaining module is further configured to obtain second depth information of a second preset point of a preset ground area of the aircraft at a second preset time, where the second preset time is a time when the aircraft is at the target coordinate; and the correction module is used for correcting the target coordinate of the target under the world coordinate system at the first preset moment according to the second depth information of the second preset point.
In an optional implementation manner of the second aspect, the obtaining module is further configured to obtain a first depth map containing the target; a determining module, configured to determine a fourth coordinate of the target in the first depth map according to a third coordinate of the target in the first image; and determining the coordinates of the target in a world coordinate system according to the fourth coordinates and the depth information of the target in the first depth map.
In an optional embodiment of the second aspect, the aircraft includes a first binocular camera disposed on the aircraft body for photographing the target, the first binocular camera being electrically connected to the computing device, the acquiring module being specifically configured to acquire a first binocular image of the target photographed by the first binocular camera; a first depth map containing the object is generated from the first binocular image of the object.
In an optional implementation manner of the second aspect, the aircraft includes a second binocular camera, the second binocular camera is disposed at the bottom of the aircraft body and is used for shooting a preset ground area, and the acquiring module is further specifically used for acquiring a second binocular image of the preset ground area shot by the second binocular camera; generating a second depth map of the preset ground area according to the second binocular image to obtain the first depth information.
In a third aspect, an embodiment provides an electronic device, including a memory and a processor, where the memory stores a computer program, and the processor executes the computer program to perform the method in the first aspect or any optional implementation manner of the first aspect.
In a fourth aspect, the embodiments provide a storage medium, on which a computer program is stored, where the computer program, when executed by a processor, performs the method in the first aspect or any optional implementation manner of the first aspect.
In a fifth aspect, embodiments provide a computer program product, which when run on a computer, causes the computer to execute the method of the first aspect or any optional implementation manner of the first aspect.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and should therefore not be regarded as limiting the scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an aircraft provided in an embodiment of the present application;
Fig. 2 is a first flowchart of a target position determining method provided in an embodiment of the present application;
Fig. 3 is a second flowchart of a target position determining method provided in an embodiment of the present application;
Fig. 4 is a third flowchart of a target position determining method provided in an embodiment of the present application;
Fig. 5 is a schematic diagram illustrating the calculation of an inclination angle provided in an embodiment of the present application;
Fig. 6 is a fourth flowchart of a target position determining method provided in an embodiment of the present application;
Fig. 7 is a schematic diagram of target position determination provided in an embodiment of the present application;
Fig. 8 is a fifth flowchart of a target position determining method provided in an embodiment of the present application;
Fig. 9 is a sixth flowchart of a target position determining method provided in an embodiment of the present application;
Fig. 10 is a seventh flowchart of a target position determining method provided in an embodiment of the present application;
Fig. 11 is a schematic structural diagram of a target position determining apparatus provided in an embodiment of the present application;
Fig. 12 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Reference numerals: 10 - aircraft; 101 - aircraft body; 102 - monocular camera; 103 - computing device; 104 - first binocular camera; 105 - second binocular camera; 1000 - acquisition module; 1100 - calculation module; 1200 - correction module; 1300 - determination module; 11 - electronic device; 1101 - processor; 1102 - memory; 1103 - communication bus.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
The present application provides an aircraft 10. As shown in fig. 1, the aircraft 10 includes an aircraft body 101, a monocular camera 102 and a computing device 103. The monocular camera 102 is disposed on the aircraft body 101 and electrically connected with the computing device 103; it photographs a target and sends the captured image to the computing device 103, which processes it. As a possible implementation, the monocular camera 102 may be arranged on top of the aircraft body 101.
In addition, the aircraft 10 may further include two binocular cameras, namely a first binocular camera 104 and a second binocular camera 105, wherein the first binocular camera 104 is disposed on the aircraft body to photograph the target; the second binocular camera 105 is disposed at the bottom of the aircraft body 101 to photograph the ground corresponding to the aircraft body 101.
Based on the aircraft 10, the present application designs a target position determining method that can estimate the position of a target. As shown in fig. 2, the method specifically includes the following steps:
step S100: the method comprises the steps of obtaining first depth information of a preset ground area of the aircraft right below a first preset moment and a first coordinate of a first preset point of the preset ground area under a world coordinate system.
Step S110: and calculating the inclination angle of the preset ground area relative to the horizontal plane according to the first depth information.
Step S120: the method comprises the steps of obtaining a second coordinate of an optical center of a monocular camera in a world coordinate system at a first preset time and a first image of a target shot by the monocular camera at the first preset time, and obtaining a third coordinate of the target in the first image.
Step S130: and calculating the target coordinate of the target in the world coordinate system at the first preset moment according to the inclination angle, the first coordinate, the second coordinate and the third coordinate.
In step S100, when the target's position is estimated in real time, the first preset moment may be the current moment; otherwise it may be any moment at which the target is tracked. When the first preset moment is the current moment, step S100 obtains the first depth information of the preset ground area directly below the aircraft at the current moment. As a possible implementation, the first preset point may be any point of the preset ground area, for example its center point.
As a possible implementation, on the basis that the aircraft has the aforementioned second binocular camera, the first depth information of the preset ground area may be obtained through the following steps, as shown in fig. 3:
step S200: and acquiring a second binocular image of the preset ground area shot by the second binocular camera.
Step S210: and generating a second depth map of the preset ground area according to the second binocular image to obtain the first depth information.
In step S200, the second binocular camera is installed at the bottom of the aircraft body and can photograph the ground area below it; the preset ground area is the ground area within the range that the second binocular camera can capture. When the first depth information of the preset ground area directly below the aircraft at the current moment is needed, the computing device controls the second binocular camera to photograph the preset ground area at that moment, obtaining a second binocular image of the area, and then executes step S210 to obtain the first depth information.
In step S210, the computing device may generate a second depth image of the preset ground area at the current moment from the second binocular image. The pixel value of each point in the second depth image represents the distance from the corresponding ground point to the optical center of the second binocular camera along the camera's optical axis; these pixel values constitute the first depth information in this embodiment. Note that the second depth image may be generated using any existing scheme for producing a depth image from a binocular image.
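The patent leaves the stereo-matching step to existing schemes. As an illustrative sketch not taken from the patent, depth for a rectified binocular pair follows the standard relation Z = f * B / d, where f is the focal length in pixels, B the baseline between the two cameras, and d the per-pixel disparity (the focal length and baseline values below are assumptions for illustration):

```python
import numpy as np

def disparity_to_depth(disparity, focal_length_px, baseline_m):
    """Depth map (metres) from a disparity map of a rectified stereo pair.

    Applies Z = f * B / d per pixel. Pixels with zero or negative disparity
    (no stereo match) are mapped to infinity.
    """
    disparity = np.asarray(disparity, dtype=np.float64)
    with np.errstate(divide="ignore"):
        return np.where(disparity > 0,
                        focal_length_px * baseline_m / disparity,
                        np.inf)

# Illustrative values: 700 px focal length, 10 cm baseline
depth = disparity_to_depth([[35.0, 70.0]], focal_length_px=700.0, baseline_m=0.10)
print(depth)  # [[2. 1.]] -- 35 px of disparity -> 2 m, 70 px -> 1 m
```

A production pipeline would first rectify the binocular pair and compute the disparity map with a stereo matcher; only the final disparity-to-depth conversion is shown here.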
On this basis, the first coordinate of the first preset point in the world coordinate system in step S100 may be obtained by first obtaining the coordinate of the first preset point in the second binocular image and then converting it to the world coordinate system using the transformation (rotation and translation) between the image coordinate system and the world coordinate system. The world coordinate system may be a north-east-down coordinate system whose origin is the aircraft's takeoff position.
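The patent only names a transformation between the image and world coordinate systems without giving its form. One common pinhole-camera formulation of this back-projection is sketched below; the intrinsics K, the pose R, t and all numeric values are illustrative assumptions, not from the patent:

```python
import numpy as np

def pixel_to_world(u, v, depth, K, R, t):
    """Back-project pixel (u, v) with known depth into world coordinates.

    K is the 3x3 camera intrinsic matrix; R and t are the camera-to-world
    rotation and translation of the camera at the moment of capture.
    """
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # normalised camera ray
    p_cam = depth * ray_cam                             # point in the camera frame
    return R @ p_cam + t                                # point in the world frame

K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 10.0])  # camera 10 m from the world origin
print(pixel_to_world(320.0, 240.0, 10.0, K, R, t))  # principal point -> [ 0.  0. 20.]
```

In practice R and t would come from the aircraft's pose estimate at the first preset moment, and the depth from the second depth image described above.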
After the first depth information of the preset ground area at the current time is obtained in the above manner, step S110 may be executed to calculate an inclination angle of the preset ground area with respect to the horizontal plane according to the first depth information.
As can be seen from the foregoing, the first depth information includes a depth value of each point of the preset ground area, and on this basis, the calculation of the inclination angle of the preset ground area relative to the horizontal plane may specifically be performed in the following manner, as shown in fig. 4, including:
step S300: and acquiring the maximum depth value, the minimum depth value and the distance value of the maximum depth value and the minimum depth value in the first depth information.
Step S310: and calculating the inclination angle of the preset ground area relative to the horizontal plane according to the maximum depth value, the minimum depth value and the distance value of the maximum depth value and the minimum depth value.
In step S300, since a depth value represents the distance from a ground point to the second binocular camera along the camera's optical axis, the point with the maximum depth value is the point of the preset ground area farthest from the camera (the lower ground), while the point with the minimum depth value lies on the raised part of the slope; the difference between the two reflects the unevenness of the ground. The distance between the maximum-depth point and the minimum-depth point refers to their horizontal distance, which can be obtained from their separation in the image using an appropriate scale.
After the maximum depth value, the minimum depth value and the horizontal distance between their points are obtained in step S300, the depth difference of the ground area is computed from the maximum and minimum depth values, and the inclination angle of the preset ground area relative to the horizontal plane is obtained from that depth difference and the horizontal distance. Specifically, as shown in fig. 5, if point A has the largest depth value and point B the smallest, then once the horizontal distance n from A to B and the depth difference h between A and B are known, the value of the angle a can be obtained.
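The geometry of fig. 5 reduces to tan(a) = h / n, with h the depth difference and n the horizontal distance. A minimal sketch (function and variable names are illustrative, not from the patent):

```python
import math

def inclination_angle(max_depth: float, min_depth: float,
                      horizontal_distance: float) -> float:
    """Inclination of the ground area relative to the horizontal plane, in radians.

    max_depth, min_depth: largest and smallest values in the ground depth map
    horizontal_distance:  horizontal distance between the two corresponding points
    """
    depth_difference = max_depth - min_depth          # h in fig. 5
    return math.atan2(depth_difference, horizontal_distance)  # angle a

# Example: a 2 m depth difference over 4 m of horizontal distance
angle = inclination_angle(max_depth=10.0, min_depth=8.0, horizontal_distance=4.0)
print(round(math.degrees(angle), 2))  # 26.57
```

Using atan2 rather than a plain division keeps the result well defined even when the horizontal distance approaches zero (a near-vertical face).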
In step S120, the computing device may also have the monocular camera capture a first image of the target at the first preset moment and obtain the target's coordinate in that image, i.e. the third coordinate. As a possible implementation, a coordinate system may be established with one corner of the first image as the origin and the horizontal and vertical directions as coordinate axes, from which the third coordinate of the target in the first image is obtained. In addition, the scheme of the application obtains the second coordinate of the monocular camera's optical center in the world coordinate system at the first preset moment, the world coordinate system again taking the takeoff position as its origin. Note that step S120 and step S100 may be performed simultaneously, or sequentially in either order.
After the inclination angle, the first coordinate, the second coordinate, and the third coordinate are obtained in the foregoing manner, step S130 may be executed to calculate a target coordinate of the target in the world coordinate system at the first preset time according to the inclination angle, the first coordinate, the second coordinate, and the third coordinate; as a possible implementation manner, step S130 calculates the target coordinates of the target in the world coordinate system at the first preset time, as shown in fig. 6, specifically by the following steps:
Step S500: determining a first linear equation according to the inclination angle and the first coordinate.
Step S510: determining a second linear equation from the optical center of the monocular camera to the target according to the second coordinate and the third coordinate.
Step S520: calculating the coordinates of the intersection point of the first linear equation and the second linear equation to obtain the target coordinate of the target in the world coordinate system at the first preset time.
In the above steps, a first linear equation can be determined from the inclination angle and the first coordinate; from the second coordinate and the third coordinate of the target in the first image, a second linear equation from the optical center of the monocular camera to the target can be determined; the two equations are then solved simultaneously to obtain their intersection point, whose coordinate is the coordinate of the target in the world coordinate system. Specifically, the above can be explained with reference to fig. 7: a first linear equation A-B is obtained from the first coordinate A and the inclination angle a; a second linear equation C-D is obtained from the second coordinate C and the third coordinate D; the intersection point E of the two equations is then the target coordinate of the target in the world coordinate system at the first preset time. Note that when the second linear equation is calculated, since the third coordinate is an image coordinate in the first image, it is first converted into the corresponding coordinate in the world coordinate system, and the second linear equation is then calculated in combination with the second coordinate.
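Steps S500 to S520 can be illustrated in the two-dimensional vertical plane of fig. 7 with the following Python sketch; the function names and sample coordinates are illustrative assumptions, not part of the application:

```python
import math

def ground_line(point_a, angle):
    """First line: through first coordinate A with slope tan(inclination)."""
    m = math.tan(angle)
    return m, point_a[1] - m * point_a[0]          # (slope, intercept)

def sight_line(point_c, point_d):
    """Second line: through optical centre C and target point D (world coords)."""
    m = (point_d[1] - point_c[1]) / (point_d[0] - point_c[0])
    return m, point_c[1] - m * point_c[0]

def intersect(line1, line2):
    """Intersection E of the two lines y = m*x + b -> target coordinate."""
    (m1, b1), (m2, b2) = line1, line2
    x = (b2 - b1) / (m1 - m2)
    return x, m1 * x + b1

# Ground rises with slope tan(a) = 0.5 from A = (0, 0); the camera at
# C = (2, 4) sees the target through world-frame image point D = (3, 2).
e = intersect(ground_line((0.0, 0.0), math.atan(0.5)),
              sight_line((2.0, 4.0), (3.0, 2.0)))
print(e)  # ≈ (3.2, 1.6)
```

The division by `m1 - m2` assumes the sight line is not parallel to the ground line, which holds whenever the camera actually sees the ground point.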
In the target position determining method designed above, first depth information of the preset ground area directly below the aircraft is obtained; the inclination angle (gradient) of the preset ground area relative to the horizontal plane is then calculated based on the first depth information; the first coordinate of the first preset point of the preset ground area in the world coordinate system at the first preset time, the second coordinate of the optical center of the monocular camera in the world coordinate system at the first preset time, and the third coordinate of the target in the first image captured by the monocular camera are further obtained; finally, the target coordinate of the target in the world coordinate system is calculated based on the inclination angle, the first coordinate, the second coordinate, and the third coordinate. In this way, the slope of the ground area where the target is located is determined when estimating the target position, and the real position of the target is estimated taking this slope into account. This solves the problem that existing methods assume the ground height (depth) does not change when estimating the target state, whereas in practice the ground height changes continuously, which makes such estimates inaccurate; the accuracy of the target position estimate is thereby improved. In addition, the target position can be estimated by simply solving two simultaneous linear equations, so the method is computationally simple, fast, and well suited to real-time analysis.
In an optional implementation manner of this embodiment, when the aircraft follows the target, after step S130, as shown in fig. 8, the solution of the present application may further include the following steps:
Step S700: acquiring second depth information of a second preset point of the preset ground area directly below the aircraft at a second preset time, where the second preset time is the time when the aircraft is located at the target coordinate.
Step S710: correcting the target coordinate of the target in the world coordinate system at the first preset time according to the second depth information of the second preset point.
In the above steps, since the aircraft performs follow-up tracking, when the aircraft flies to the target coordinate obtained for the first preset time, it may capture a depth map of the preset ground area with the second binocular camera to obtain the second depth information of the second preset point; this second depth information represents the height value at the target coordinate as measured from the depth map. The target coordinate of the target in the world coordinate system at the first preset time is then corrected according to this second depth information: if the second depth information is inconsistent with the height value of the target coordinate, the height value is corrected based on the second depth information. For example, suppose the height coordinate of the intersection point E obtained from the simultaneous linear equations is 10 meters, but when the aircraft flies to the coordinate of point E, the depth information obtained from the depth map is 12 meters (the depth information represents the distance from the ground to the second binocular camera; since the aircraft is then approximately directly above point E, this depth information can be taken as the height value of point E). On this basis, the value of 10 meters is corrected using the measured 12 meters, yielding the corrected target coordinate of the target in the world coordinate system at the first preset time.
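Assuming the correction simply replaces the estimated height with the depth-map measurement, which is one plausible reading of step S710 (the function name and sample values are illustrative), a minimal sketch:

```python
def correct_target_height(target_coord, measured_height):
    """Replace the estimated height of the target coordinate with the height
    measured from the depth map once the aircraft is above the target."""
    x, y, _ = target_coord
    return (x, y, measured_height)

# Intersection E was estimated at a height of 10 m; the depth map read 12 m.
print(correct_target_height((3.0, 4.0, 10.0), 12.0))  # (3.0, 4.0, 12.0)
```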
In an optional implementation manner of this embodiment, in addition to the foregoing determining the target coordinates of the target at the first preset time, the calculation may be performed in the following manner, as shown in fig. 9, where the scheme of the present application may further include the following steps:
Step S800: acquiring a first depth map containing the target.
Step S810: determining a fourth coordinate of the target in the first depth map according to the third coordinate of the target in the first image.
Step S820: determining the coordinate of the target in the world coordinate system according to the fourth coordinate and the depth information of the target in the first depth map.
In step S800, the present application obtains a first depth map containing the target. As a possible implementation, it has been described above that the aircraft may further include a first binocular camera, which may be disposed at the front of the aircraft and is also used for shooting the target; on this basis, as shown in fig. 10, the first depth map may be obtained specifically through the following steps:
step S900: a first binocular image of a subject captured by a first binocular camera is acquired.
Step S910: a first depth map of the target is generated from the first binocular image of the target.
In the above steps, the aircraft may control the first binocular camera to capture the target, obtaining a first binocular image containing the target, and then generate the first depth map containing the target from that binocular image.
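The application does not detail how the depth map is generated from the binocular image; the standard pinhole stereo relation z = f·B/d (focal length f in pixels, baseline B, disparity d) is a common choice. A hedged sketch with illustrative camera parameters:

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Convert a per-pixel disparity map to depth via z = f * B / d.

    Pixels with zero disparity carry no depth information and are set to inf.
    """
    d = np.asarray(disparity, dtype=float)
    depth = np.full(d.shape, np.inf)
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

disp = np.array([[8.0, 4.0],
                 [0.0, 2.0]])
# f*B = 400 px * 0.5 m = 200, so the depths are 25 m, 50 m, inf, 100 m.
print(depth_from_disparity(disp, focal_px=400.0, baseline_m=0.5))
```

In practice the disparity map itself would come from a stereo-matching step (e.g. block matching) on the rectified binocular pair.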
After the first depth map containing the target is obtained in the above manner, step S810 may be performed to determine the fourth coordinate of the target in the first depth map according to the third coordinate of the target in the first image.
In step S810, the present application may obtain in advance the transformation relationship between the coordinate systems of the first binocular camera and the monocular camera. For example, if the transformation between the first binocular camera and the monocular camera involves rotation only, with no translation (a reasonable approximation when the target is far away), the relationship between the two coordinate systems is:

z_f · K_f⁻¹ · P_uvf = R_f←m · z_c · K_m⁻¹ · P_uvm

From the above, using the rotation matrix from the monocular camera to the first binocular camera, the pixel mapping is:

P_uvf = (z_c / z_f) · K_f · R_f←m · K_m⁻¹ · P_uvm

where P_uvf represents the fourth coordinate in the depth map; P_uvm represents the third coordinate in the first image captured by the monocular camera; K_f and K_m are the intrinsic parameter matrices of the first binocular camera and the monocular camera, respectively; z_f is the depth value in the depth map; R_f←m is the rotation matrix from the monocular camera to the first binocular camera; and z_c is the depth value in the first image.
After the rotation matrix is obtained, the fourth coordinate of the target in the first depth map may be determined according to the third coordinate of the target in the first image, that is, the tracking frame of the monocular camera is projected into the depth map of the first binocular camera, and then step S820 is executed to determine the coordinate of the target in the world coordinate system according to the fourth coordinate and the depth information of the target in the first depth map.
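Under the rotation-only assumption above, projecting a pixel of the monocular tracking frame into the binocular depth map reduces to applying the homography K_f · R_f←m · K_m⁻¹ (the depth factors cancel in the final pixel coordinates). A sketch with placeholder intrinsics:

```python
import numpy as np

def project_pixel(p_uvm, K_m, K_f, R_f_m):
    """Map the third coordinate (monocular pixel) to the fourth coordinate
    (depth-map pixel) for a rotation-only camera-to-camera transform."""
    p = np.array([p_uvm[0], p_uvm[1], 1.0])       # homogeneous pixel
    q = K_f @ R_f_m @ np.linalg.solve(K_m, p)     # K_f * R * K_m^-1 * p
    return q[:2] / q[2]                           # back to pixel coordinates

K = np.array([[400.0,   0.0, 320.0],
              [  0.0, 400.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)  # identical orientation: the pixel maps to itself
print(project_pixel((100.0, 50.0), K, K, R))  # ≈ [100.  50.]
```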
In step S820, since the depth information of the target in the first depth map is known and the fourth coordinate of the target in the first depth map is known, the target coordinate of the target in the world coordinate system can be obtained.
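Step S820 amounts to back-projecting the fourth coordinate with its depth into the camera frame and then transforming the result into the world frame; the pose parameters R_wc and t_wc below are illustrative assumptions rather than quantities defined by the application:

```python
import numpy as np

def pixel_to_world(p_uvf, depth, K_f, R_wc, t_wc):
    """Back-project a depth-map pixel to 3-D: P_c = z * K^-1 * p, then
    move the camera-frame point into the world frame: P_w = R_wc P_c + t_wc."""
    p = np.array([p_uvf[0], p_uvf[1], 1.0])
    P_c = depth * np.linalg.solve(K_f, p)   # point in the camera frame
    return R_wc @ P_c + t_wc

K = np.array([[400.0,   0.0, 320.0],
              [  0.0, 400.0, 240.0],
              [  0.0,   0.0,   1.0]])
# Camera axes aligned with the world, optical centre 2 m from the origin
# along the optical axis: the principal point back-projects straight ahead.
P = pixel_to_world((320.0, 240.0), 10.0, K, np.eye(3), np.array([0.0, 0.0, 2.0]))
print(P)  # ≈ [ 0.  0. 12.]
```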
In the embodiment designed above, the first binocular camera captures a first binocular image of the target, from which the first depth map is obtained; the fourth coordinate of the target in the first depth map is then determined based on the third coordinate of the target in the first image; and the coordinate of the target in the world coordinate system is obtained based on the fourth coordinate and the depth information of the target in the first depth map. It can be seen that the real position of the target can be estimated during terrain changes simply by using the target depth map of the first binocular camera and the target image of the monocular camera. This solves the problem that existing methods assume the ground height (depth) does not change when estimating the target state, whereas in practice it changes continuously, which makes such estimates inaccurate; the accuracy of the target position estimate is thereby improved.
Fig. 11 shows a schematic block diagram of the aircraft-based target position determining apparatus provided in the present application. It should be understood that the apparatus corresponds to the method embodiments of fig. 2 to 10 and can execute the steps involved in the foregoing method; for the specific functions of the apparatus, reference may be made to the description above, and detailed description is appropriately omitted here to avoid redundancy. The apparatus includes at least one software functional module that can be stored in memory in the form of software or firmware, or solidified in the operating system (OS) of the apparatus. Specifically, the apparatus includes: an acquiring module 1000 configured to acquire first depth information of the preset ground area directly below the aircraft at a first preset time and a first coordinate of a first preset point of the preset ground area in a world coordinate system; and a calculating module 1100 configured to calculate the inclination angle of the preset ground area relative to the horizontal plane according to the first depth information. The acquiring module 1000 is further configured to acquire a second coordinate of the optical center of the monocular camera in the world coordinate system at the first preset time and a first image containing the target captured by the monocular camera at the first preset time, and to acquire a third coordinate of the target in the first image; the calculating module 1100 is further configured to calculate the target coordinate of the target in the world coordinate system at the first preset time according to the inclination angle, the first coordinate, the second coordinate, and the third coordinate.
In the aircraft-based target position determining apparatus designed above, first depth information of the preset ground area directly below the aircraft is obtained; the inclination angle (gradient) of the preset ground area relative to the horizontal plane is then calculated based on the first depth information; the first coordinate of the first preset point of the preset ground area in the world coordinate system at the first preset time, the second coordinate of the optical center of the monocular camera in the world coordinate system at the first preset time, and the third coordinate of the target in the first image captured by the monocular camera are obtained; finally, the target coordinate of the target in the world coordinate system at the first preset time is calculated based on the inclination angle, the first coordinate, the second coordinate, and the third coordinate. In this way, the slope of the ground area where the target is located is determined when estimating the target position, and the real position of the target is estimated taking the slope into account; this solves the problem that existing methods assume the ground height (depth) does not change when estimating the target state, whereas in practice it changes continuously, and thereby improves the accuracy of the target position estimate.
In an optional implementation manner of this embodiment, the calculating module 1100 is specifically configured to obtain a maximum depth value, a minimum depth value, and distance values of the maximum depth value and the minimum depth value in the first depth information; and calculating the inclination angle of the preset ground area relative to the horizontal plane according to the maximum depth value, the minimum depth value and the distance value of the maximum depth value and the minimum depth value.
In an optional implementation manner of this embodiment, the calculating module 1100 is further specifically configured to determine a first linear equation according to the inclination angle and the first coordinate; determining a second linear equation from the optical center of the monocular camera to the target according to the second coordinate and the third coordinate; and calculating the coordinates of the intersection point of the first linear equation and the second linear equation to obtain the coordinates of the target under the world coordinate system at the first preset moment.
In an optional implementation manner of this embodiment, the obtaining module 1000 is further configured to obtain second depth information of a second preset point of a preset ground area of the aircraft at a second preset time, where the second preset time is a time when the aircraft is in the target coordinate; and the correcting module 1200 is configured to correct the target coordinate of the target in the world coordinate system at the first preset time according to the second depth information of the second preset point.
In an optional implementation manner of this embodiment, the obtaining module 1000 is further configured to obtain a first depth map including the target; a determining module 1300, configured to determine a fourth coordinate of the target in the first depth map according to the third coordinate of the target in the first image; and determining the coordinates of the target in the world coordinate system according to the fourth coordinates and the depth information of the target in the first depth map.
In an optional implementation manner of this embodiment, the aircraft includes a first binocular camera, the first binocular camera is disposed on the aircraft body for shooting a target, the first binocular camera is electrically connected to the computing device, and the acquiring module 1000 is specifically configured to acquire a first binocular image of the target shot by the first binocular camera; a first depth map containing the object is generated from the first binocular image of the object.
In an optional implementation manner of this embodiment, the aircraft includes a second binocular camera, the second binocular camera is disposed at the bottom of the aircraft body for shooting the preset ground area, and the obtaining module 1000 is further specifically configured to obtain a second binocular image of the preset ground area shot by the second binocular camera; and generating a second depth map of the preset ground area according to the second binocular image to obtain the first depth information.
As shown in fig. 12, the present application provides an electronic device 11 including a processor 1101 and a memory 1102, which are interconnected and communicate with each other via a communication bus 1103 and/or another form of connection mechanism (not shown). The memory 1102 stores a computer program executable by the processor 1101; when the computing device runs, the processor 1101 executes the computer program to perform the method of the first embodiment or any alternative implementation thereof, for example steps S100 to S130: acquiring first depth information of the preset ground area directly below the aircraft at a first preset time and a first coordinate of a first preset point of the preset ground area in a world coordinate system; calculating the inclination angle of the preset ground area relative to the horizontal plane according to the first depth information; acquiring a second coordinate of the optical center of the monocular camera in the world coordinate system at the first preset time and a first image of the target captured by the monocular camera at the first preset time, and acquiring a third coordinate of the target in the first image; and calculating the target coordinate of the target in the world coordinate system at the first preset time according to the inclination angle, the first coordinate, the second coordinate, and the third coordinate.
The present application provides a storage medium having a computer program stored thereon, where the computer program is executed by a processor to perform the method of the first embodiment or any alternative implementation manner of the first embodiment.
The storage medium may be implemented by any type of volatile or nonvolatile storage device or combination thereof, such as a Static Random Access Memory (SRAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), a magnetic Memory, a flash Memory, a magnetic disk, or an optical disk.
The present application provides a computer program product which, when run on a computer, causes the computer to perform the method of the first embodiment or any alternative implementation thereof.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
It should be noted that the functions, if implemented in the form of software functional modules and sold or used as independent products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. An aircraft-based target position determination method, wherein the aircraft comprises an aircraft body, a monocular camera and a computing device, the monocular camera is arranged on the aircraft body to shoot the target, the computing device is electrically connected with the monocular camera, and the method is applied to the computing device and comprises the following steps:
acquiring first depth information of a preset ground area directly below the aircraft at a first preset moment, and a first coordinate of a first preset point of the preset ground area in a world coordinate system;
calculating an inclination angle of the preset ground area relative to a horizontal plane according to the first depth information;
acquiring a second coordinate of the optical center of the monocular camera in a world coordinate system at the first preset moment and a first image of the target shot by the monocular camera at the first preset moment, and acquiring a third coordinate of the target in the first image;
and calculating the target coordinate of the target in the world coordinate system at the first preset moment according to the inclination angle, the first coordinate, the second coordinate and the third coordinate.
2. The method of claim 1, wherein calculating an inclination angle of the preset ground area with respect to a horizontal plane from the first depth information comprises:
acquiring a maximum depth value, a minimum depth value and distance values of the maximum depth value and the minimum depth value in the first depth information;
and calculating the inclination angle of the preset ground area relative to the horizontal plane according to the maximum depth value, the minimum depth value and the distance value between the maximum depth value and the minimum depth value.
3. The method of claim 1, wherein the calculating target coordinates of the target in the world coordinate system at the first preset time based on the tilt angle, the first coordinate, the second coordinate, and the third coordinate comprises:
determining a first linear equation according to the inclination angle and the first coordinate;
determining a second linear equation from the optical center of the monocular camera to the target according to the second coordinate and the third coordinate;
and calculating the coordinates of the intersection point of the first linear equation and the second linear equation to obtain the coordinates of the target in the world coordinate system at the first preset moment.
4. The method of claim 1, wherein after the calculating target coordinates of the target in the world coordinate system at the first preset time based on the tilt angle, the first coordinate, the second coordinate, and the third coordinate, the method further comprises:
acquiring second depth information of a second preset point of a preset ground area of the aircraft at a second preset time, wherein the second preset time is the time when the aircraft is located at the target coordinate;
and correcting the target coordinate of the target under the world coordinate system at the first preset moment according to the second depth information of the second preset point.
5. The method of claim 1, further comprising:
acquiring a first depth map containing the target;
determining a fourth coordinate of the target in the first depth map according to the third coordinate of the target in the first image;
and determining the coordinates of the target in a world coordinate system according to the fourth coordinates and the depth information of the target in the first depth map.
6. The method of claim 5, wherein the aerial vehicle comprises a first binocular camera disposed on the aerial vehicle body for capturing the target, the first binocular camera electrically connected to the computing device, the acquiring a first depth map containing the target comprising:
acquiring a first binocular image of the target photographed by the first binocular camera;
a first depth map containing the object is generated from the first binocular image of the object.
7. The method of claim 1, wherein the aircraft comprises a second binocular camera disposed at a bottom of the aircraft body for photographing the preset ground area, and the acquiring the first depth information of the preset ground area directly below the aircraft at the first preset moment comprises:
acquiring a second binocular image of the preset ground area shot by the second binocular camera;
generating a second depth map of the preset ground area according to the second binocular image to obtain the first depth information.
8. An aircraft-based target position determination apparatus, wherein the aircraft includes an aircraft body, a monocular camera provided on the aircraft body to photograph the target, and a computing device electrically connected to the monocular camera, the apparatus being applied to the computing device, comprising:
the acquiring module is used for acquiring first depth information of a preset ground area of the aircraft right below a first preset moment and a first coordinate of a first preset point of the preset ground area in a world coordinate system;
the calculation module is used for calculating the inclination angle of the preset ground area relative to the horizontal plane according to the first depth information;
the acquiring module is further configured to acquire a second coordinate of the optical center of the monocular camera in the world coordinate system at the first preset time and a first image including the target, which is shot by the monocular camera at the first preset time, and acquire a third coordinate of the target in the first image;
the calculation module is further configured to calculate a target coordinate of the target in the world coordinate system at a first preset time according to the inclination angle, the first coordinate, the second coordinate, and the third coordinate.
9. An electronic device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the method of any one of claims 1 to 7 when executing the computer program.
10. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of any of claims 1 to 7.
CN202110344049.9A 2021-03-30 2021-03-30 Target position determining method and device based on aircraft Pending CN113077436A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110344049.9A CN113077436A (en) 2021-03-30 2021-03-30 Target position determining method and device based on aircraft


Publications (1)

Publication Number Publication Date
CN113077436A 2021-07-06

Family

ID=76611783

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110344049.9A Pending CN113077436A (en) 2021-03-30 2021-03-30 Target position determining method and device based on aircraft

Country Status (1)

Country Link
CN (1) CN113077436A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113824516A (en) * 2021-08-06 2021-12-21 星展测控科技股份有限公司 Video receiving method, video receiving equipment and readable storage medium
CN113824516B (en) * 2021-08-06 2024-01-12 星展测控科技股份有限公司 Video receiving method, video receiving device and readable storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20240621