CN116817929A - Method and system for simultaneously positioning multiple targets on ground plane by unmanned aerial vehicle - Google Patents


Publication number
CN116817929A
CN116817929A (application CN202311085054.8A)
Authority
CN
China
Prior art keywords
image acquisition
target point
center
acquisition equipment
north
Prior art date
Legal status: Granted
Application number
CN202311085054.8A
Other languages
Chinese (zh)
Other versions
CN116817929B (en)
Inventor
申晓雷
何举刚
马迎辉
王璟
朱茅
Current Assignee
China Ordnance Equipment Group Ordnance Equipment Research Institute
Original Assignee
China Ordnance Equipment Group Ordnance Equipment Research Institute
Priority date
Filing date
Publication date
Application filed by China Ordnance Equipment Group Ordnance Equipment Research Institute filed Critical China Ordnance Equipment Group Ordnance Equipment Research Institute
Priority to CN202311085054.8A
Publication of CN116817929A
Application granted
Publication of CN116817929B
Legal status: Active
Anticipated expiration

Links

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 - Determining position
    • G01S19/45 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides a method and a system for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle. The method comprises: identifying the pixel coordinates of all target points to be monitored from environment image information acquired by an image acquisition device; acquiring the pose information of the field-of-view center of the image acquisition device at the current moment and the pixel coordinates of the image center, and calculating the north-east-down attitude of each target point based on the physical focal length and physical pixel size of the image acquisition device and the pixel coordinates of the target point; calculating the distance from each target point to the optical center of the image acquisition device based on the height of the unmanned aerial vehicle and the north-east-down attitude of each target point; calculating the north-east-down coordinates of each target point relative to the optical center based on that distance and attitude; and combining the longitude and latitude of the unmanned aerial vehicle to calculate the longitude and latitude of each target point, thereby positioning each target point. The application solves the problem of simultaneously positioning all targets in a captured image.

Description

Method and system for simultaneously positioning multiple targets on ground plane by unmanned aerial vehicle
Technical Field
The application belongs to the technical field of target tracking, and particularly relates to a method and a system for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle.
Background
With the development of unmanned aerial vehicle technology, unmanned aerial vehicles are applied in the military field in more and more scenarios, and locating photographed targets with the unmanned aerial vehicle's optoelectronic pod is an important application. The optoelectronic pod is a device mounted on unmanned platforms such as unmanned aerial vehicles and unmanned boats, forming with those platforms various systems that carry out optical reconnaissance and monitoring tasks over an area. The specific process of target positioning is: first, a visual method is used to detect targets in the image, and then the longitude and latitude of each target are calculated.
Target positioning depends on the attitude of the camera relative to the north-east-down frame, the distance from the target to the camera optical center, and the longitude and latitude of the camera optical center. One method of target positioning directly uses the unmanned aerial vehicle's gimbal, laser ranging, and GPS data to calculate the longitude and latitude of the field-of-view center position. However, the single laser beam of the unmanned aerial vehicle's optoelectronic pod generally strikes only the center of the camera's field of view, so the unmanned aerial vehicle can only obtain the distance from the field-of-view center to the camera origin; positioning targets in this manner is therefore limited. Another method positions an arbitrary target by triangulation, which involves feature matching and estimation of the camera's intrinsic and extrinsic parameters, so target positioning with this method is unstable.
Disclosure of Invention
In order to solve the technical problems, the application provides a method and a system for simultaneously positioning multiple targets of an unmanned aerial vehicle on a ground plane.
The first aspect of the application discloses a method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle. The unmanned aerial vehicle comprises an image acquisition device, a gimbal, a GPS device, and a laser ranging device.
The image acquisition device is mounted on the gimbal; the center of its field of view is adjusted by controlling the rotation of the gimbal; the north-east-down (NED) attitude of the field-of-view center of the image acquisition device is obtained from the NED attitude of the gimbal; the GPS device obtains the longitude and latitude of the unmanned aerial vehicle; and the laser ranging device obtains the laser ranging value of the image acquisition device;
the method comprises the following steps:
s1, identifying pixel coordinates of all target points to be monitored from environment image information acquired by image acquisition equipment;
s2, acquiring pose information of the visual field center of the image acquisition equipment and pixel coordinates of the image center at the current moment, and calculating the north-east pose of each target point based on the physical focal length and the physical pixel size of the image acquisition equipment and the pixel coordinates of the target point;
the pose information of the view center of the image acquisition device at the current moment comprises the north-east pose of the view center of the image acquisition device at the current moment and the laser ranging value at the current moment; the physical focal length and the physical size of the pixels of the image acquisition equipment are internal parameters of the image acquisition equipment;
s3, calculating the distance from each target point to the optical center of the image acquisition equipment based on the height information of the unmanned aerial vehicle and the north-east posture of each target point;
s4, calculating the north-east coordinate of each target point relative to the optical center of the image acquisition equipment based on the distance from each target point to the optical center of the image acquisition equipment and the north-east posture of each target point, and calculating the longitude and latitude information of each target point by combining the longitude and latitude information of the unmanned aerial vehicle so as to position each target point.
According to the method of the first aspect of the present application, in step S2, the step of calculating the north-east posture of each target point includes:
s21, obtaining pixel coordinates from the target point to the center of the image based on the difference value between the pixel coordinates of the target point and the pixel coordinates of the center of the image;
s22, obtaining physical coordinates from the target point to the center of the image based on the product of the physical size of the pixel of the image acquisition device and the pixel coordinates from the target point to the center of the image;
s23, based on the physical coordinates of the target point to the image center, the physical focal length of the image acquisition equipment and the laser ranging value, obtaining angle information of the image acquisition equipment to be rotated around the coordinate axis of the image acquisition equipment when the visual field center of the image acquisition equipment is to be adjusted to the target point;
s24, according to the angle information that the image acquisition equipment rotates around the coordinate axis of the image acquisition equipment, the view center of the image acquisition equipment is adjusted to the target point, and based on the north-east posture of the view center of the image acquisition equipment at the current moment, the north-east posture of the target point is calculated.
According to the method of the first aspect of the present application, in the step S23, obtaining angle information about a coordinate axis of the image capturing device about which the image capturing device is to rotate when the center of the field of view of the image capturing device is to be adjusted to the target point includes:
s231, acquiring an X-direction distance from a target point to the center of the field of view of the image acquisition device and a Y-direction distance from the target point to the center of the field of view of the image acquisition device;
the equation for obtaining the X-direction distance from the target point to the center of the field of view of the image acquisition device is:
dist_x = fabs(img_x - cx) × dx × c_distance / focus
where focus is the physical focal length of the image acquisition device, c_distance is the laser ranging value, fabs is the absolute-value function, (img_x - cx) × dx is the X-axis physical coordinate of the target point relative to the image center, img_x is the X-axis pixel coordinate of the target point, cx is the X-axis pixel coordinate of the image center, dx is the physical size of a pixel along the X-axis of the image acquisition device, and dist_x is the X-direction distance from the target point to the field-of-view center of the image acquisition device;
the equation for obtaining the Y-direction distance from the target point to the center of the field of view of the image acquisition device is:
dist_y = fabs(img_y - cy) × dy × c_distance / focus
where img_y is the Y-axis pixel coordinate of the target point, cy is the Y-axis pixel coordinate of the image center, dy is the physical size of a pixel along the Y-axis of the image acquisition device, and dist_y is the Y-direction distance from the target point to the field-of-view center of the image acquisition device;
s232, obtaining an included angle between a straight line L formed by the target point and the optical center of the image acquisition device and an optical axis z of the image acquisition device in an xz plane and an yz plane according to a laser ranging value of the center of the view of the image acquisition device, an X-direction distance between the target point and the center of the view of the image acquisition device and a Y-direction distance between the target point and the center of the view of the image acquisition device;
the included angle between the straight line L and the optical axis z of the image acquisition device in the xz plane is as follows:
ay=arctan(dist_x / c_distance)
the included angle between the straight line L and the optical axis z of the image acquisition device in the yz plane is as follows:
ax=arctan(dist_y / c_distance)
the image acquisition equipment rotates around the y axis of the image acquisition equipment, the corresponding ay is marked as d_yaw, the image acquisition equipment rotates around the x axis of the image acquisition equipment, the corresponding ax is marked as d_pitch, namely the angle information of the image acquisition equipment, which is required to rotate around the coordinate axis of the image acquisition equipment, at the position, where the center of the visual field of the image acquisition equipment is to be adjusted to the target point is d_yaw and d_pitch;
the first rotation matrix d_R is constructed from d_yw and d_pitch.
According to the method of the first aspect of the present application, the step S24 specifically includes:
acquiring a north-east posture R2 of a target point based on the product of the north-east posture of the visual field center of the image acquisition equipment at the current moment and the first rotation matrix d_R;
wherein R1 is north east pose yaw1, pitch1, roll1 of the visual field center of the image acquisition device at the current moment, and Euler angles corresponding to R2 are yaw2, pitch2, roll2.
According to the method of the first aspect of the present application, the formula for calculating the distance from the target point to the optical center of the image acquisition device is:
distance2 = height / sin(fabs(pitch2))
distance2 is the distance from the target point to the optical center of the image acquisition device, height is the height of the unmanned aerial vehicle, and pitch2 is the pitch angle in the north-east-down attitude of the target point.
According to the method of the first aspect of the application, the formulas for calculating the north-east-down coordinates of the target point relative to the optical center of the image acquisition device are:
north = distance2 × cos(pitch2) × cos(yaw2)
east = distance2 × cos(pitch2) × sin(yaw2)
depth = distance2 × sin(fabs(pitch2))
where east, depth, north are respectively the east-axis, down-axis, and north-axis values in the north-east-down coordinate system; when the z-axis of the image acquisition device points north, the x-axis points east, and the y-axis points toward the ground, the axes correspond to N, E, D of the north-east-down coordinate system.
The method according to the first aspect of the application, the method further comprising:
based on the difference between the north-east-down attitude of each target point and the north-east-down attitude of the field-of-view center of the image acquisition device at the current moment, obtaining the angle through which the gimbal must rotate to bring the field-of-view center of the image acquisition device onto the target point;
the angles through which the gimbal must rotate are: yaw2 - yaw1, pitch2 - pitch1, roll2 - roll1.
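The gimbal angle computation above can be sketched as follows (an illustrative sketch; wrapping each difference back into the gimbal's stated [-PI, PI] range is an added assumption, since the text only gives the raw differences):

```python
import math

def gimbal_delta(yaw1, pitch1, roll1, yaw2, pitch2, roll2):
    """Per-axis rotation (rad) to move the field-of-view center to the target."""
    def wrap(a):
        # Wrap an angle into [-pi, pi], matching the gimbal's angle range.
        return math.atan2(math.sin(a), math.cos(a))
    return (wrap(yaw2 - yaw1), wrap(pitch2 - pitch1), wrap(roll2 - roll1))
```

Without the wrap, a target just across the ±PI yaw seam would command a nearly full turn instead of the short rotation.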
A second aspect of the application discloses a system for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle. The unmanned aerial vehicle comprises an image acquisition device, a gimbal, a GPS device, and a laser ranging device.
The image acquisition device is mounted on the gimbal; the center of its field of view is adjusted by controlling the rotation of the gimbal; the north-east-down attitude of the field-of-view center of the image acquisition device is obtained from the north-east-down attitude of the gimbal of the unmanned aerial vehicle; the GPS device obtains the longitude and latitude of the unmanned aerial vehicle; and the laser ranging device obtains the laser ranging value of the image acquisition device;
the system comprises:
the detection module is configured to identify pixel coordinates of all target points to be monitored from the environment image information acquired by the image acquisition equipment;
the first processing module is configured to acquire pose information of the visual field center of the image acquisition device and pixel coordinates of the image center at the current moment, and calculate the north-east pose of each target point based on the physical focal length and the physical pixel size of the image acquisition device and the pixel coordinates of the target point;
the pose information of the view center of the image acquisition device at the current moment comprises the north-east pose of the view center of the image acquisition device at the current moment and the laser ranging value at the current moment; the physical focal length and the physical size of the pixels of the image acquisition equipment are internal parameters of the image acquisition equipment;
the second processing module is configured to calculate the distance between each target point and the optical center of the image acquisition device based on the height information of the unmanned aerial vehicle and the north-east posture of each target point;
the third processing module is configured to calculate the north-east coordinate of each target point relative to the optical center of the image acquisition device based on the distance from each target point to the optical center of the image acquisition device and the north-east posture of each target point, and calculate the longitude and latitude information of each target point by combining the longitude and latitude information of the unmanned aerial vehicle so as to position each target point.
A third aspect of the application discloses an electronic device. The electronic device comprises a memory and a processor; the memory stores a computer program, and when the processor executes the computer program, it implements the steps of the method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to any one of the first aspect of the present disclosure.
A fourth aspect of the application discloses a computer-readable storage medium. The computer-readable storage medium stores a computer program which, when executed by a processor, implements the steps of the method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to any one of the first aspect of the present disclosure.
In summary, the scheme provided by the application achieves the following technical effects: the longitude and latitude of each target are further calculated from the north-east-down coordinates of the target point relative to the camera optical center, which solves both the problem of automatically adjusting the field-of-view center of the unmanned aerial vehicle's camera to any target position and the problem of simultaneously positioning all targets in the captured image.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present application, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for simultaneous localization of multiple targets of a drone to a ground plane according to an embodiment of the present application;
fig. 2 is a block diagram of a system for simultaneous localization of multiple targets of a drone to a ground plane according to an embodiment of the present application;
fig. 3 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It will be understood that the terms first, second, etc. as used herein may be used to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another element. For example, a first image may be referred to as a second image, and similarly, a second image may be referred to as a first image, without departing from the scope of the application. Both the first image and the second image are images, but they are not the same image.
The application discloses a method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle. The unmanned aerial vehicle comprises an image acquisition device, a gimbal, a GPS device, and a laser ranging device.
The image acquisition device is mounted on the gimbal; the center of its field of view is adjusted by controlling the rotation of the gimbal; the north-east-down attitude of the field-of-view center of the image acquisition device is obtained from the north-east-down attitude of the gimbal; the GPS device obtains the longitude and latitude of the unmanned aerial vehicle; and the laser ranging device obtains the laser ranging value at the field-of-view center position of the image acquisition device. The pose information of the camera's field-of-view center is thus obtained through the gimbal and the laser ranging device of the unmanned aerial vehicle. In the following, the image acquisition device is described as a camera.
Referring to fig. 1, the method comprises the steps of:
s1, identifying pixel coordinates of all target points to be monitored from environment image information acquired by image acquisition equipment;
in the step, the pixel coordinates of the target are obtained by detecting the target of the unmanned aerial vehicle video stream.
S2, acquiring pose information of the visual field center of the image acquisition equipment and pixel coordinates of the image center at the current moment, and calculating the north-east pose of each target point based on the physical focal length and the physical pixel size of the image acquisition equipment and the pixel coordinates of the target point;
the pose information of the view center of the image acquisition device at the current moment comprises the north-east pose of the view center of the image acquisition device at the current moment and the laser ranging value at the current moment; the physical focal length and the physical size of the pixels of the image acquisition equipment are internal parameters of the image acquisition equipment;
according to the method of the first aspect of the present application, in step S2, the step of calculating the north-east posture of each target point includes:
s21, obtaining pixel coordinates from the target point to the center of the image based on the difference value between the pixel coordinates of the target point and the pixel coordinates of the center of the image;
s22, obtaining physical coordinates from the target point to the center of the image based on the product of the physical size of the pixel of the image acquisition device and the pixel coordinates from the target point to the center of the image;
s23, based on the physical coordinates of the target point to the image center, the physical focal length of the image acquisition equipment and the laser ranging value, obtaining angle information of the image acquisition equipment to be rotated around the coordinate axis of the image acquisition equipment when the visual field center of the image acquisition equipment is to be adjusted to the target point;
in the step S23, obtaining the angle information that the image capturing device needs to rotate around the coordinate axis of the image capturing device when the center of the field of view of the image capturing device is to be adjusted to the target point includes:
s231, acquiring an X-direction distance from a target point to the center of the field of view of the image acquisition device and a Y-direction distance from the target point to the center of the field of view of the image acquisition device;
the equation for obtaining the X-direction distance from the target point to the center of the field of view of the image acquisition device is:
dist_x = fabs(img_x - cx) × dx × c_distance / focus
where focus is the physical focal length of the image acquisition device, c_distance is the laser ranging value, fabs is the absolute-value function, (img_x - cx) × dx is the X-axis physical coordinate of the target point relative to the image center, img_x is the X-axis pixel coordinate of the target point, cx is the X-axis pixel coordinate of the image center, dx is the physical size of a pixel along the X-axis of the image acquisition device, and dist_x is the X-direction distance from the target point to the field-of-view center of the image acquisition device;
the equation for obtaining the Y-direction distance from the target point to the center of the field of view of the image acquisition device is:
dist_y = fabs(img_y - cy) × dy × c_distance / focus
where img_y is the Y-axis pixel coordinate of the target point, cy is the Y-axis pixel coordinate of the image center, dy is the physical size of a pixel along the Y-axis of the image acquisition device, and dist_y is the Y-direction distance from the target point to the field-of-view center of the image acquisition device;
s232, obtaining an included angle between a straight line L formed by the target point and the optical center of the image acquisition device and an optical axis z of the image acquisition device in an xz plane and an yz plane according to a laser ranging value of the center of the view of the image acquisition device, an X-direction distance between the target point and the center of the view of the image acquisition device and a Y-direction distance between the target point and the center of the view of the image acquisition device;
the included angle between the straight line L and the optical axis z of the image acquisition device in the xz plane is as follows:
ay=arctan(dist_x / c_distance)
the included angle between the straight line L and the optical axis z of the image acquisition device in the yz plane is as follows:
ax=arctan(dist_y / c_distance)
the image acquisition equipment rotates around the y axis of the image acquisition equipment, the corresponding ay is marked as d_yaw, the image acquisition equipment rotates around the x axis of the image acquisition equipment, the corresponding ax is marked as d_pitch, namely the angle information of the image acquisition equipment, which is required to rotate around the coordinate axis of the image acquisition equipment, at the position, where the center of the visual field of the image acquisition equipment is to be adjusted to the target point is d_yaw and d_pitch;
The yaw angle of the gimbal is positive yawing to the right, with range [-PI, PI]; the roll angle is positive rolling to the right, with range [-PI, PI].
The origin of the world coordinate system is the camera optical center. When the camera attitude is (0, 0, 0), the camera z-axis points north, the camera x-axis points east, and the camera y-axis points toward the ground.
If the center of the visual field is automatically adjusted to the target point:
when img_y < cy, the camera rotates counterclockwise about the x-axis by ax, i.e., heads up, where d_pitch is positive and vice versa.
At img_x < cx, the camera rotates zy counterclockwise about the y-axis, i.e., yawing left, where d_yaw is negative and vice versa.
The first rotation matrix d_R is constructed from d_yaw and d_pitch. d_yaw and d_pitch are the angles through which the camera must rotate about its own coordinate axes to bring its field-of-view center onto the target point, and they correspond to the first rotation matrix d_R.
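The sign conventions and the construction of d_R described above can be sketched as follows (an illustrative sketch; composing d_R as a y-axis rotation followed by an x-axis rotation is an assumption, since the text does not state the multiplication order):

```python
import math

def signed_angles(img_x, img_y, cx, cy, ax, ay):
    """Apply the stated sign conventions to the unsigned angles ax, ay."""
    # img_y < cy: target above center -> pitch up, d_pitch positive.
    d_pitch = ax if img_y < cy else -ax
    # img_x < cx: target left of center -> yaw left, d_yaw negative.
    d_yaw = -ay if img_x < cx else ay
    return d_yaw, d_pitch

def rot_x(a):
    return [[1.0, 0.0, 0.0],
            [0.0, math.cos(a), -math.sin(a)],
            [0.0, math.sin(a),  math.cos(a)]]

def rot_y(a):
    return [[ math.cos(a), 0.0, math.sin(a)],
            [0.0, 1.0, 0.0],
            [-math.sin(a), 0.0, math.cos(a)]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def build_d_R(d_yaw, d_pitch):
    # Assumed order: rotation about the camera y-axis, then the x-axis.
    return matmul(rot_y(d_yaw), rot_x(d_pitch))
```

With zero angles, build_d_R returns the identity matrix, i.e., the field-of-view center is already on the target.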
S24, according to the angle information that the image acquisition equipment rotates around the coordinate axis of the image acquisition equipment, the view center of the image acquisition equipment is adjusted to the target point, and based on the north-east posture of the view center of the image acquisition equipment at the current moment, the north-east posture of the target point is calculated.
The north-east-down attitude R2 of the target point is obtained as the product of the north-east-down attitude of the field-of-view center of the image acquisition device at the current moment and the first rotation matrix d_R:
R2 = R1 · d_R
where R1 is the rotation matrix of the north-east-down attitude (yaw1, pitch1, roll1) of the field-of-view center of the image acquisition device at the current moment, and the Euler angles corresponding to R2 are yaw2, pitch2, roll2.
In this step, the world coordinate system is the north-east-down coordinate system, and the gimbal yaw, pitch, and roll values are all expressed in it. The camera on the unmanned aerial vehicle is modeled as a pinhole camera. The laser ranging value at the field-of-view center is c_distance. x, y, z are the three axes of the camera coordinate system, with the z-axis along the camera optical axis.
The camera optical center and the GPS origin are considered to be at the same position, so the longitude and latitude of that position are available.
S3, calculating the distance from each target point to the optical center of the image acquisition equipment based on the height information of the unmanned aerial vehicle and the north-east posture of each target point;
according to the method of the first aspect of the present application, the formula for calculating the distance from the target point to the optical center of the image acquisition device is:
distance2 = height / sin(fabs(pitch2))
distance2 is the distance from the target point to the optical center of the image acquisition device, height is the flight height of the unmanned aerial vehicle, and pitch2 is the pitch angle in the north-east pose of the target point.
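A minimal sketch of this flat-ground range recovery, assuming pitch2 is in radians and measured from the horizontal (negative when the camera looks down):

```python
import math

def target_distance(height, pitch2):
    """distance2 = height / sin(fabs(pitch2)): slant range from the target
    point on the ground plane to the optical center, assuming flat ground."""
    return height / math.sin(abs(pitch2))
```

Looking straight down (pitch2 = -PI/2) the slant range equals the flight height; at 30 degrees below the horizon it is twice the height.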
S4, calculating the north-east coordinate of each target point relative to the optical center of the image acquisition equipment based on the distance from each target point to the optical center of the image acquisition equipment and the north-east posture of each target point, and calculating the longitude and latitude information of each target point by combining the longitude and latitude information of the unmanned aerial vehicle so as to position each target point.
According to the method of the first aspect of the application, the formulas for calculating the north-east coordinates of the target point relative to the optical center of the image acquisition device are:

north = distance2 * cos(pitch2) * cos(yaw2)
east = distance2 * cos(pitch2) * sin(yaw2)
depth = distance2 * sin(fabs(pitch2))

wherein east, depth and north are respectively the east-axis value, the earth (down) axis value and the north-axis value in the north-east-down coordinates. When the pose of the image acquisition device is zero, its z axis points north, its x axis points east, and its y axis points upward from the ground, corresponding to N, E, D of the north-east-down coordinate system.
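The computation of the target's north-east-down offset relative to the optical center, together with the latitude/longitude shift in step S4, can be sketched as follows. The trigonometric decomposition and the local-tangent-plane conversion (with the WGS-84 equatorial radius) are reconstructions consistent with the slant-range formula, not the patent's own expressions.

```python
import math

def target_ned(distance2, yaw2, pitch2):
    """North, east and down offsets of the target point from the optical
    center, from the slant range and the target's yaw2/pitch2 (radians)."""
    horiz = distance2 * math.cos(pitch2)       # range projected on the ground plane
    north = horiz * math.cos(yaw2)
    east = horiz * math.sin(yaw2)
    depth = distance2 * math.sin(abs(pitch2))  # equals the flight height
    return north, east, depth

def offset_latlon(lat_deg, lon_deg, north, east, radius=6378137.0):
    """Shift the drone's GPS fix by a small north/east offset in meters
    (local-tangent-plane approximation; WGS-84 equatorial radius assumed)."""
    dlat = math.degrees(north / radius)
    dlon = math.degrees(east / (radius * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```

At pitch2 = -45 degrees and a slant range of 100*sqrt(2) m, the target lies 100 m north of the camera and 100 m below it.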
According to the method of the first aspect of the application, the method further comprises:
based on the difference between the north-east pose of each target point and the north-east pose of the field-of-view center of the image acquisition device at the current moment, obtaining the angle through which the cradle head must rotate to adjust the field-of-view center of the image acquisition device to the target point;
the angle through which the cradle head must rotate is: yaw2-yaw1, pitch2-pitch1, roll2-roll1.
This solves the problem of automatically adjusting the center of the field of view of the unmanned aerial vehicle camera to an arbitrary target position.
The step of adjusting the cradle head is as follows:
under the world coordinate system, the initial Euler angles of the cradle head at the current moment are yaw1, pitch1 and roll1, and the corresponding initial camera pose is denoted R1;
according to the scheme above, the angles d_yaw and d_pitch through which the camera must be adjusted in the camera coordinate system are calculated, and the corresponding rotation matrix is d_R;
after the lens is adjusted, the center of the field of view is aligned with the target point, and the pose corresponding to the target point, namely the new field-of-view center, is R2 = R1 * d_R. With the Euler angles corresponding to R2 being yaw2, pitch2 and roll2, automatically adjusting the original field-of-view center to the target point requires the cradle head to rotate by yaw2-yaw1, pitch2-pitch1 and roll2-roll1.
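The cradle-head adjustment above reduces to component-wise angle differences. The wrapping into (-PI, PI] is an illustrative addition, consistent with the stated ranges of the yaw and roll angles.

```python
import math

def gimbal_deltas(yaw1, pitch1, roll1, yaw2, pitch2, roll2):
    """Angles (radians) through which the cradle head must rotate to move
    the field-of-view center from pose (yaw1, pitch1, roll1) to pose
    (yaw2, pitch2, roll2): yaw2-yaw1, pitch2-pitch1, roll2-roll1,
    wrapped to the shortest equivalent rotation."""
    def wrap(a):
        # map any angle into (-pi, pi]
        return math.atan2(math.sin(a), math.cos(a))
    return (wrap(yaw2 - yaw1), wrap(pitch2 - pitch1), wrap(roll2 - roll1))
```

Without wrapping, a move from yaw1 = 3 rad to yaw2 = -3 rad would command a -6 rad sweep; the wrapped delta is the short way round, 2*PI - 6 (about 0.283 rad).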
A second aspect of the application discloses a system for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle; the unmanned aerial vehicle comprises an image acquisition device, a cradle head, a GPS device and a laser ranging device;
the image acquisition device is mounted on the cradle head; the center of the field of view of the image acquisition device is adjusted by controlling the rotation of the cradle head, and the north-east pose of the field-of-view center of the image acquisition device is obtained from the north-east pose of the cradle head of the unmanned aerial vehicle; the GPS device is used to obtain the longitude and latitude information of the unmanned aerial vehicle, and the laser ranging device is used to obtain the laser ranging value of the image acquisition device;
referring to fig. 2, the system 100 includes:
the detection module 101 is configured to identify pixel coordinates of all target points to be monitored from environment image information acquired by the image acquisition device;
a first processing module 102, configured to acquire the pose information of the field-of-view center of the image acquisition device and the pixel coordinates of the image center at the current moment, and to calculate the north-east pose of each target point based on the physical focal length and physical pixel size of the image acquisition device and the pixel coordinates of the target point;
the pose information of the view center of the image acquisition device at the current moment comprises the north-east pose of the view center of the image acquisition device at the current moment and the laser ranging value at the current moment; the physical focal length and the physical size of the pixels of the image acquisition equipment are internal parameters of the image acquisition equipment;
a second processing module 103 configured to calculate a distance from each target point to the optical center of the image capturing device based on the altitude information of the unmanned aerial vehicle and the north-east posture of each target point;
the third processing module 104 is configured to calculate the north-east coordinate of each target point relative to the optical center of the image capturing device based on the distance between each target point and the optical center of the image capturing device and the north-east posture of each target point, and calculate the longitude and latitude information of each target point by combining the longitude and latitude information of the unmanned aerial vehicle, so as to position each target point.
A third aspect of the application discloses an electronic device. The electronic device comprises a memory and a processor, the memory storing a computer program; when executing the computer program, the processor implements the steps of the method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to the first aspect of the present disclosure.
Fig. 3 is a block diagram of an electronic device according to an embodiment of the present application. As shown in fig. 3, the electronic device includes a processor, a memory, a communication interface, a display screen and an input device connected through a system bus. The processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for their operation. The communication interface of the electronic device is used for wired or wireless communication with an external terminal; the wireless communication can be implemented through Wi-Fi, an operator network, near-field communication (NFC) or other technologies. The display screen of the electronic device can be a liquid crystal display screen or an electronic ink display screen. The input device of the electronic device can be a touch layer covering the display screen, keys, a trackball or a touchpad arranged on the housing of the electronic device, or an external keyboard, touchpad or mouse.
It will be appreciated by those skilled in the art that the structure shown in fig. 3 is merely a block diagram of the portion related to the technical solution of the present disclosure and does not limit the electronic devices to which the technical solution can be applied; a specific electronic device may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently.
A fourth aspect of the application discloses a computer-readable storage medium. The computer-readable storage medium stores a computer program which, when executed by a processor, implements the steps of the method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to the first aspect of the present disclosure.
In summary, the technical scheme provided by the application has the following technical effects:
note that the technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be regarded as the scope of the description. The foregoing examples illustrate only a few embodiments of the application, which are described in detail and are not to be construed as limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of protection of the present application is to be determined by the appended claims.

Claims (10)

1. A method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle is characterized in that,
the unmanned aerial vehicle comprises image acquisition equipment, a cradle head, a GPS device and a laser ranging device;
the image acquisition equipment is arranged on the cradle head, the view center of the image acquisition equipment is adjusted by controlling the rotation of the cradle head, the north east earth pose of the view center of the image acquisition equipment is obtained by obtaining the north east earth pose of the cradle head, the GPS device is used for obtaining the longitude and latitude information of the unmanned aerial vehicle, and the laser ranging device is used for obtaining the laser ranging value of the view center position of the image acquisition equipment;
the method comprises the following steps:
s1, identifying pixel coordinates of all target points to be monitored from environment image information acquired by image acquisition equipment;
s2, acquiring pose information of the visual field center of the image acquisition equipment and pixel coordinates of the image center at the current moment, and calculating the north-east pose of each target point based on the physical focal length and the physical pixel size of the image acquisition equipment and the pixel coordinates of the target point;
the pose information of the view center of the image acquisition device at the current moment comprises the north-east pose of the view center of the image acquisition device at the current moment and the laser ranging value at the current moment; the physical focal length and the physical size of the pixels of the image acquisition equipment are internal parameters of the image acquisition equipment;
s3, calculating the distance from each target point to the optical center of the image acquisition equipment based on the height information of the unmanned aerial vehicle and the north-east posture of each target point;
s4, calculating the north-east coordinate of each target point relative to the optical center of the image acquisition equipment based on the distance from each target point to the optical center of the image acquisition equipment and the north-east posture of each target point, and calculating the longitude and latitude information of each target point by combining the longitude and latitude information of the unmanned aerial vehicle so as to position each target point.
2. The method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to claim 1, wherein in step S2, the step of calculating the north-east pose of each target point comprises:
s21, obtaining pixel coordinates from the target point to the center of the image based on the difference value between the pixel coordinates of the target point and the pixel coordinates of the center of the image;
s22, obtaining physical coordinates from the target point to the center of the image based on the product of the physical size of the pixel of the image acquisition device and the pixel coordinates from the target point to the center of the image;
s23, based on the physical coordinates of the target point to the image center, the physical focal length of the image acquisition equipment and the laser ranging value, obtaining angle information of the image acquisition equipment to be rotated around the coordinate axis of the image acquisition equipment when the visual field center of the image acquisition equipment is to be adjusted to the target point;
s24, according to the angle information that the image acquisition equipment rotates around the coordinate axis of the image acquisition equipment, the view center of the image acquisition equipment is adjusted to the target point, and based on the north-east posture of the view center of the image acquisition equipment at the current moment, the north-east posture of the target point is calculated.
3. The method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to claim 2, wherein in step S23, obtaining the angle information through which the image acquisition device must rotate around its own coordinate axes to adjust the center of its field of view to the target point comprises:
s231, acquiring an X-direction distance from a target point to the center of the field of view of the image acquisition device and a Y-direction distance from the target point to the center of the field of view of the image acquisition device;
the equation for obtaining the X-direction distance from the target point to the center of the field of view of the image acquisition device is:

dist_x = (img_x - cx) * dx * c_distance / focus

wherein focus is the physical focal length of the image acquisition device, c_distance is the laser ranging value, fabs is the absolute-value function, (img_x - cx) * dx is the X-axis coordinate value of the physical coordinate from the target point to the image center, img_x is the X-axis coordinate value of the pixel coordinate of the target point, cx is the X-axis coordinate value of the pixel coordinate of the image center, dx is the physical size of a pixel along the X axis of the image acquisition device, and dist_x is the X-direction distance from the target point to the center of the field of view of the image acquisition device;
the equation for obtaining the Y-direction distance from the target point to the center of the field of view of the image acquisition device is:

dist_y = (img_y - cy) * dy * c_distance / focus

wherein (img_y - cy) * dy is the Y-axis coordinate value of the physical coordinate from the target point to the image center, img_y is the Y-axis coordinate value of the pixel coordinate of the target point, cy is the Y-axis coordinate value of the pixel coordinate of the image center, dy is the physical size of a pixel along the Y axis of the image acquisition device, and dist_y is the Y-direction distance from the target point to the center of the field of view of the image acquisition device;
s232, obtaining an included angle between a straight line L formed by the target point and the optical center of the image acquisition device and an optical axis z of the image acquisition device in an xz plane and an yz plane according to a laser ranging value of the center of the view of the image acquisition device, an X-direction distance between the target point and the center of the view of the image acquisition device and a Y-direction distance between the target point and the center of the view of the image acquisition device;
the included angle between the straight line L and the optical axis z of the image acquisition device in the xz plane is as follows:
ay=arctan(dist_x / c_distance)
the included angle between the straight line L and the optical axis z of the image acquisition device in the yz plane is as follows:
ax=arctan(dist_y / c_distance)
when the image acquisition device rotates around its own y axis, the corresponding ay is denoted d_yaw; when the image acquisition device rotates around its own x axis, the corresponding ax is denoted d_pitch; that is, the angle information through which the image acquisition device must rotate around its own coordinate axes to adjust the center of its field of view to the target point is d_yaw and d_pitch;
the first rotation matrix d_R is constructed from d_yaw and d_pitch.
4. The method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to claim 3, wherein step S24 specifically comprises:
acquiring the north-east pose R2 of the target point as the product of the north-east pose of the field-of-view center of the image acquisition device at the current moment and the first rotation matrix d_R:

R2 = R1 * d_R

wherein R1 is the north-east pose (yaw1, pitch1, roll1) of the field-of-view center of the image acquisition device at the current moment, and the Euler angles corresponding to R2 are yaw2, pitch2, roll2.
5. The method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to claim 4, wherein the formula for calculating the distance from the target point to the optical center of the image acquisition device is:
distance2 = height / sin(fabs(pitch2))
distance2 is the distance from the target point to the optical center of the image acquisition device, height is the flight height of the unmanned aerial vehicle, and pitch2 is the pitch angle in the north-east pose of the target point.
6. The method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to claim 5, wherein the formulas for calculating the north-east coordinates of the target point relative to the optical center of the image acquisition device are:

north = distance2 * cos(pitch2) * cos(yaw2)
east = distance2 * cos(pitch2) * sin(yaw2)
depth = distance2 * sin(fabs(pitch2))

wherein east, depth and north are respectively the east-axis value, the earth (down) axis value and the north-axis value in the north-east-down coordinates; when the pose of the image acquisition device is zero, its z axis points north, its x axis points east, and its y axis points upward from the ground, corresponding to N, E, D of the north-east-down coordinate system.
7. The method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to claim 6, further comprising:
based on the difference between the north-east pose of each target point and the north-east pose of the field-of-view center of the image acquisition device at the current moment, obtaining the angle through which the cradle head must rotate to adjust the field-of-view center of the image acquisition device to the target point;
the angle through which the cradle head must rotate is: yaw2-yaw1, pitch2-pitch1, roll2-roll1.
8. A system for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle, characterized in that the system comprises an image acquisition device, a cradle head, a GPS device and a laser ranging device;
the image acquisition equipment is arranged on the cradle head, the view center of the image acquisition equipment is adjusted by controlling the rotation of the cradle head, the north-east attitude of the view center of the image acquisition equipment is obtained by obtaining the north-east attitude of the cradle head of the unmanned aerial vehicle, the GPS device is used for obtaining the longitude and latitude information of the unmanned aerial vehicle, and the laser ranging device is used for obtaining the laser ranging value of the image acquisition equipment;
the system comprises:
the detection module is configured to identify pixel coordinates of all target points to be monitored from the environment image information acquired by the image acquisition equipment;
the first processing module is configured to acquire pose information of the visual field center of the image acquisition device and pixel coordinates of the image center at the current moment, and calculate the north-east pose of each target point based on the physical focal length and the physical pixel size of the image acquisition device and the pixel coordinates of the target point;
the pose information of the view center of the image acquisition device at the current moment comprises the north-east pose of the view center of the image acquisition device at the current moment and the laser ranging value at the current moment; the physical focal length and the physical size of the pixels of the image acquisition equipment are internal parameters of the image acquisition equipment;
the second processing module is configured to calculate the distance between each target point and the optical center of the image acquisition device based on the height information of the unmanned aerial vehicle and the north-east posture of each target point;
the third processing module is configured to calculate the north-east coordinate of each target point relative to the optical center of the image acquisition device based on the distance from each target point to the optical center of the image acquisition device and the north-east posture of each target point, and calculate the longitude and latitude information of each target point by combining the longitude and latitude information of the unmanned aerial vehicle so as to position each target point.
9. An electronic device, characterized by comprising a memory and a processor, the memory storing a computer program; when executing the computer program, the processor implements the steps of the method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon which, when executed by a processor, implements the steps of the method for simultaneously positioning multiple targets on a ground plane by an unmanned aerial vehicle according to any one of claims 1 to 7.
CN202311085054.8A 2023-08-28 2023-08-28 Method and system for simultaneously positioning multiple targets on ground plane by unmanned aerial vehicle Active CN116817929B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311085054.8A CN116817929B (en) 2023-08-28 2023-08-28 Method and system for simultaneously positioning multiple targets on ground plane by unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN116817929A true CN116817929A (en) 2023-09-29
CN116817929B CN116817929B (en) 2023-11-10

Family

ID=88114777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311085054.8A Active CN116817929B (en) 2023-08-28 2023-08-28 Method and system for simultaneously positioning multiple targets on ground plane by unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN116817929B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117809217A (en) * 2023-12-26 2024-04-02 浙江大学 Method and system for scouting and beating based on real-time single-stage target recognition

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104501779A (en) * 2015-01-09 2015-04-08 中国人民解放军63961部队 High-accuracy target positioning method of unmanned plane on basis of multi-station measurement
CN111167052A (en) * 2020-03-18 2020-05-19 沈阳天目科技有限公司 Automatic fire monitor target longitude and latitude calculation method based on camera positioning
WO2021189456A1 (en) * 2020-03-27 2021-09-30 深圳市大疆创新科技有限公司 Unmanned aerial vehicle inspection method and apparatus, and unmanned aerial vehicle
CN113920186A (en) * 2021-10-13 2022-01-11 中国电子科技集团公司第五十四研究所 Low-altitude unmanned-machine multi-source fusion positioning method
US20220221857A1 (en) * 2019-05-09 2022-07-14 Sony Group Corporation Information processing apparatus, information processing method, program, and information processing system
CN114812558A (en) * 2022-04-19 2022-07-29 中山大学 Monocular vision unmanned aerial vehicle autonomous positioning method combined with laser ranging
CN114973037A (en) * 2022-06-15 2022-08-30 中国人民解放军军事科学院国防科技创新研究院 Unmanned aerial vehicle intelligent detection and synchronous positioning multi-target method
CN116385504A (en) * 2023-03-15 2023-07-04 智洋创新科技股份有限公司 Inspection and ranging method based on unmanned aerial vehicle acquisition point cloud and image registration

Also Published As

Publication number Publication date
CN116817929B (en) 2023-11-10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant