CN113189890A - Simulation method and device for unmanned aerial vehicle target positioning - Google Patents


Info

Publication number
CN113189890A
CN113189890A (application CN202010035235.XA)
Authority
CN
China
Prior art keywords
target
image
unmanned aerial
aerial vehicle
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010035235.XA
Other languages
Chinese (zh)
Inventor
陈宇楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingbangda Trade Co Ltd
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingbangda Trade Co Ltd
Beijing Jingdong Qianshi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingbangda Trade Co Ltd, Beijing Jingdong Qianshi Technology Co Ltd filed Critical Beijing Jingbangda Trade Co Ltd
Priority to CN202010035235.XA priority Critical patent/CN113189890A/en
Publication of CN113189890A publication Critical patent/CN113189890A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B17/00Systems involving the use of models or simulators of said systems
    • G05B17/02Systems involving the use of models or simulators of said systems electric

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the invention discloses a simulation method and a simulation device for unmanned aerial vehicle target positioning, and relates to the technical field of target positioning. The method comprises the following steps: while the unmanned aerial vehicle model flies, acquiring a first image comprising a first target to be positioned through a photoelectric pod model carried on the unmanned aerial vehicle model; performing image processing on the first image to obtain the pixel coordinates of the first target in the first image; acquiring state information of the unmanned aerial vehicle model and the photoelectric pod model; and calculating a simulated value of the geographic position of the first target in the three-dimensional simulation environment based on the pixel coordinates and the state information. The simulation method improves the accuracy of the simulation result of unmanned aerial vehicle target positioning, reduces the cost of real-flight testing, and increases the likelihood of successfully deploying the algorithm.

Description

Simulation method and device for unmanned aerial vehicle target positioning
Technical Field
The invention relates to the technical field of target positioning, in particular to a simulation method and device for unmanned aerial vehicle target positioning.
Background
Currently, unmanned aerial vehicles are commonly used to detect and track ground targets. The flight conditions an unmanned aerial vehicle requires are demanding, and a pilot must cooperate throughout the flight. When a new algorithm is developed, if it is tested directly on a real aircraft without first being verified by a complete system, a crash or damage to the surrounding environment causes a large economic loss. Faults that occur during a real-flight test are irreversible, the test conditions are difficult to reproduce, and the cause of a fault is hard to analyze. Moreover, operating the unmanned aerial vehicle consumes the pilot's time and energy, wasting manpower and material resources. Unmanned aerial vehicle simulation, as an emerging field, can rule out errors at the algorithm-logic level in advance, and is a necessary step in algorithm development.
In the related art, MATLAB is commonly used for unmanned aerial vehicle simulation. MATLAB simulation is a non-real-time, offline simulation method. It emphasizes the effect of local control, obstacle avoidance, tracking and other algorithms, and does not cover the stage in which sensors acquire data. The simulation input is artificially supplied data, and the mathematical model of the unmanned aerial vehicle differs from the real model to some extent. This reduces the accuracy of the algorithm simulation while lengthening algorithm development.
Disclosure of Invention
In view of this, the embodiment of the present invention provides a simulation method for unmanned aerial vehicle target positioning that requires no manually input data; moreover, the unmanned aerial vehicle model and photoelectric pod model adopted by the simulation method are consistent with the real models, so the accuracy of the simulation result of unmanned aerial vehicle target positioning is improved, the cost of real-flight testing is reduced, and the likelihood of successfully deploying the algorithm is increased.
According to one aspect of the invention, a simulation method for unmanned aerial vehicle target positioning is provided, wherein an unmanned aerial vehicle model is built in a three-dimensional simulation environment, and the simulation method for unmanned aerial vehicle target positioning comprises the following steps:
when the unmanned aerial vehicle model flies, acquiring a first image comprising a first target to be positioned through a photoelectric pod model carried on the unmanned aerial vehicle model;
performing image processing on the first image to obtain a pixel coordinate of the first target in the first image;
acquiring state information of the unmanned aerial vehicle model and the photoelectric pod model; and
based on the pixel coordinates and the state information, a simulated value of the geographic location of the first target in the three-dimensional simulation environment is calculated.
Preferably, the simulation method for unmanned aerial vehicle target positioning further includes:
obtaining a position true value of the geographic position of the first target in the three-dimensional simulation environment;
comparing the location true value and the simulated value to determine a simulated error of the geographic location of the first target.
Preferably, when the unmanned aerial vehicle model flies, the step of acquiring a first image including a first target to be positioned through an optoelectronic pod model mounted on the unmanned aerial vehicle model includes:
selecting the first target to be positioned from a plurality of images acquired by the photoelectric pod model;
tracking the first target based on a kernel correlation filtering algorithm tracking program; and
and in the process of acquiring the first image in real time by the photoelectric pod model carried by the unmanned aerial vehicle model, adjusting the state information of the unmanned aerial vehicle model and the photoelectric pod model in real time to enable the first image to comprise the first target.
Preferably, the image processing the first image to obtain the pixel coordinates of the first target in the first image includes:
converting the image format of the first image to obtain a second image;
and in a cross-platform computer vision library, carrying out image processing on the second image to obtain the pixel coordinates of the first target in the first image.
Preferably, said calculating a simulated value of the geographic position of said first target in said three-dimensional simulation environment based on said pixel coordinates and said state information comprises:
and based on the pixel coordinates and the state information, performing coordinate system transformation on the pixel coordinates of the first target in the first image to obtain a simulated value of the geographic position of the first target in the three-dimensional simulation environment.
Preferably, the state information of the unmanned aerial vehicle model and the optoelectronic pod model includes: the position of the unmanned aerial vehicle model, the attitude of the unmanned aerial vehicle model, the position of the photoelectric pod model and the attitude of the photoelectric pod model.
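As a rough illustration, the state information enumerated above could be grouped into simple containers. The class and field names here are invented for this sketch and are not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class UavState:
    # Hypothetical container for the unmanned aerial vehicle model's state:
    # position in the simulation world frame, attitude as Euler angles.
    position: tuple  # (x, y, z) in metres
    attitude: tuple  # (roll, pitch, yaw) in radians

@dataclass
class PodState:
    # Hypothetical container for the photoelectric pod model's state,
    # e.g. its mounting offset and gimbal attitude.
    position: tuple
    attitude: tuple

uav = UavState(position=(56.0, -2.0, 80.0), attitude=(0.0, 0.0, 1.57))
pod = PodState(position=(0.1, 0.0, -0.2), attitude=(0.0, -1.57, 0.0))
```

Both positions and attitudes are needed because the target-positioning computation chains the pod-to-vehicle and vehicle-to-world transforms.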
According to another aspect of the present invention, there is provided a simulation apparatus for drone target positioning, including:
the image acquisition module is used for acquiring a first image comprising a first target to be positioned in real time;
the image processing module is used for carrying out image processing on the first image to obtain the pixel coordinates of the first target in the first image;
the data acquisition module is used for acquiring the state information of the unmanned aerial vehicle model and the photoelectric pod model in real time; and
and the calculating module is used for calculating a simulation value of the geographical position of the first target in the three-dimensional simulation environment in a real-time simulation mode based on the pixel coordinates and the state information.
Preferably, the simulation apparatus for drone target positioning further includes:
the tracking module is used for tracking the first target in real time through a tracking program of a kernel correlation filtering algorithm and obtaining, in real time, the position true value of the geographical position of the first target in the three-dimensional simulation environment;
a comparison module for comparing the location true value and the simulation value to determine a simulation error of the unmanned aerial vehicle model locating the geographic location of the first target.
Preferably, the acquiring in real time a first image comprising a first target to be positioned comprises:
selecting the first target to be positioned from a plurality of images acquired in real time;
tracking the first target based on a kernel correlation filtering algorithm tracking program; and
adjusting the state information of the unmanned aerial vehicle model and the optoelectronic pod model in real time during the acquisition of the first image in real time so that the first image includes the first target.
Preferably, the image processing the first image to obtain the pixel coordinates of the first target in the first image includes:
converting the image format of the first image to obtain a second image;
and in a cross-platform computer vision library, carrying out image processing on the second image to obtain the pixel coordinates of the first target in the first image.
Preferably, said calculating, based on said pixel coordinates and said state information, simulated values of the geographic position of said first target in said three-dimensional simulation environment in real-time simulation comprises:
and performing coordinate system transformation on the pixel coordinates of the first target in the first image based on the pixel coordinates and the state information, and calculating a simulation value of the geographical position of the first target in the three-dimensional simulation environment in real time.
Preferably, the state information of the unmanned aerial vehicle model and the optoelectronic pod model includes: the position of the unmanned aerial vehicle model, the attitude of the unmanned aerial vehicle model, the position of the photoelectric pod model and the attitude of the photoelectric pod model.
According to a further aspect of the present invention, there is provided a computer readable storage medium storing computer instructions which, when executed, implement the method of simulation of drone target localization as described above.
According to another aspect of the present invention, there is provided a simulation control device for target positioning of an unmanned aerial vehicle, including:
a memory for storing computer instructions; a processor coupled to the memory, the processor configured to execute a simulation method implementing drone target location as described above based on computer instructions stored by the memory.
One embodiment of the present invention has the following advantages or benefits:
the simulation of the unmanned aerial vehicle target positioning is a complete closed-loop real-time simulation, and data does not need to be input manually, so that the logic of a simulation system is complete. And the unmanned aerial vehicle model and the photoelectric pod model adopted by the simulation method are consistent with the real model, so that the accuracy of the simulation result of the unmanned aerial vehicle target positioning is improved.
The position true value and the simulated value of the geographic position of the first target in the three-dimensional simulation environment can be obtained directly, making it convenient to compare them to determine the simulation error of the geographic position of the first target. By analyzing the simulation error, the parameters of the algorithm for unmanned aerial vehicle target positioning can be adjusted and optimized. Therefore, the development time of the algorithm is shortened, the cost of real-flight testing is reduced, and the likelihood of successfully deploying the algorithm is increased.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent from the following description of the embodiments of the present invention with reference to the accompanying drawings, in which:
fig. 1 shows a schematic flow diagram of a simulation method for drone target location according to an embodiment of the present invention.
Fig. 2a shows a schematic structural diagram of an optoelectronic pod model according to an embodiment of the present invention.
Figure 2b shows a schematic structural diagram of an unmanned aerial vehicle model of one embodiment of the present invention.
Fig. 3 shows a schematic flow chart of a simulation method for drone target location according to an embodiment of the present invention.
Fig. 4 shows a schematic diagram of an algorithm development flow.
Fig. 5a shows a schematic diagram of a three-dimensional simulation environment for drone target localization of one embodiment of the present invention.
Fig. 5b shows a schematic diagram of simulation results of drone target positioning in the x-axis direction of one embodiment of the present invention.
Fig. 5c is a schematic diagram of simulation results of drone target location in the y-axis direction of one embodiment of the present invention.
Fig. 5d shows a schematic diagram of simulation errors of drone target positioning in the x-axis direction of one embodiment of the present invention.
Fig. 5e shows a schematic diagram of simulation errors of drone target positioning in the y-axis direction of one embodiment of the present invention.
Fig. 6 shows a schematic structural diagram of a simulation apparatus for drone target location according to an embodiment of the present invention.
Fig. 7 shows a schematic structural diagram of a simulation apparatus for drone target location according to an embodiment of the present invention.
Fig. 8 shows a structural diagram of a simulation control apparatus for drone target location according to an embodiment of the present invention.
Detailed Description
The present invention will be described below based on examples, but the present invention is not limited to these examples. In the following detailed description of the present invention, certain specific details are set forth. It will be apparent to one skilled in the art that the present invention may be practiced without these specific details. Well-known methods, procedures, and components have not been described in detail so as not to obscure the present invention. The figures are not necessarily drawn to scale.
Fig. 1 is a schematic flow chart of a simulation method for unmanned aerial vehicle target positioning according to an embodiment of the present invention. An unmanned aerial vehicle model is built in a three-dimensional simulation environment, and the simulation method specifically comprises the following steps:
in step S101, a first image including a first target to be positioned is acquired by a photoelectric pod model mounted thereon while the unmanned aerial vehicle model is flying.
In this step, while the unmanned aerial vehicle model is flying, a first image including a first target to be positioned is acquired through a photoelectric pod model mounted thereon.
In step S102, the first image is subjected to image processing to obtain pixel coordinates of the first target in the first image.
In this step, the first image is subjected to image processing to obtain pixel coordinates of the first object in the first image.
In step S103, status information of the unmanned aerial vehicle model and the photoelectric pod model is acquired.
In this step, status information of the drone model and the optoelectronic pod model is obtained.
In step S104, a simulated value of the geographical position of the first target in the three-dimensional simulation environment is calculated based on the pixel coordinates and the state information.
In this step, a simulated value of the geographic position of the first target in the three-dimensional simulation environment is calculated based on the pixel coordinates of the first target in the first image and the state information of the drone model and the optoelectronic pod model.
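Steps S101 through S104 form one closed-loop positioning iteration. A minimal sketch, in which every function name is a hypothetical placeholder rather than an API from the patent, might look like:

```python
# Hypothetical skeleton of the closed-loop simulation steps S101-S104.
# All function names are placeholders invented for this sketch.

def run_positioning_step(capture_image, detect_target, read_state, locate):
    image = capture_image()            # S101: frame from the pod camera model
    pixel_xy = detect_target(image)    # S102: target's pixel coordinates
    state = read_state()               # S103: UAV and pod position/attitude
    return locate(pixel_xy, state)     # S104: simulated geographic position

# Toy stand-ins so the loop is runnable end to end:
result = run_positioning_step(
    capture_image=lambda: "frame-0",
    detect_target=lambda img: (320, 240),
    read_state=lambda: {"uav_pos": (0.0, 0.0, 80.0)},
    locate=lambda px, st: (st["uav_pos"][0] + px[0] * 0.01,
                           st["uav_pos"][1] + px[1] * 0.01),
)
```

In the real simulation this step would run repeatedly while the drone model flies, which is what makes the loop closed and real-time.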
According to the embodiment of the invention, the simulation of the target positioning of the unmanned aerial vehicle is a complete closed-loop real-time simulation, and data does not need to be input manually, so that the logic of a simulation system is complete. And the unmanned aerial vehicle model and the photoelectric pod model adopted by the simulation method are consistent with the real model, so that the accuracy of the simulation result of the unmanned aerial vehicle target positioning is improved.
The Robot Operating System (ROS) is a set of software frameworks designed for robot software development. It is an open-source meta operating system that provides operating-system-like services, including hardware abstraction, low-level device driver management, implementations of commonly used functionality, inter-process message passing, and package management, and it also provides tools and libraries for obtaining, building, writing, and running code across multiple machines.
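The inter-process message passing that ROS provides is topic-based. The toy stand-in below only mimics that publish/subscribe idea in plain Python; it is not the real rospy/rclpy API, and the topic name is invented:

```python
# Toy illustration of ROS-style topic publish/subscribe. NOT the real ROS
# API (rospy/rclpy provide this over a network); here it is mimicked
# in-process so the idea is runnable anywhere.
class TopicBus:
    def __init__(self):
        self._subs = {}

    def subscribe(self, topic, callback):
        # Register a callback to be invoked for every message on `topic`.
        self._subs.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        # Deliver `msg` to every subscriber of `topic`.
        for cb in self._subs.get(topic, []):
            cb(msg)

bus = TopicBus()
received = []
bus.subscribe("/pod/image_raw", received.append)   # hypothetical topic name
bus.publish("/pod/image_raw", {"seq": 1, "data": "frame"})
```

In the simulation described here, the camera model would publish image messages on such a topic and the image-processing node would subscribe to them.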
Fig. 3 is a flowchart illustrating a simulation method for drone target positioning according to an embodiment of the present invention, performed, for example, in the robot three-dimensional simulator Gazebo on the software framework of the Robot Operating System (ROS). This embodiment is a more complete simulation method for unmanned aerial vehicle target positioning than the previous embodiment. The method specifically comprises the following steps:
in step S301, a first image including a first target to be positioned is acquired by a photoelectric pod model mounted thereon while the unmanned aerial vehicle model is flying.
The photoelectric pod model in the three-dimensional simulation environment, as shown in fig. 2a, acquires images in real time to monitor the surrounding environment and find a target. The photoelectric pod model is mounted on a PX4 open-source quadrotor unmanned aerial vehicle model, as shown in fig. 2b, and is a two-degree-of-freedom photoelectric pod model.
And controlling the posture and the position of the unmanned aerial vehicle model and the posture and the position of the photoelectric pod model through keyboard operation. The photoelectric pod model collects images in real time through a built-in camera to monitor the surrounding environment in real time so as to find a target.
In this step, a first target to be positioned is selected from a plurality of images acquired by the optoelectronic pod model. For example, when a target to be positioned appears in a plurality of images acquired by the photoelectric pod model in real time, the target is selected as a first target by manually framing the target. And tracking the first target based on a tracking program of a kernel correlation filtering algorithm after the first target to be positioned is found. And in the process of acquiring the first image in real time by the photoelectric pod model carried by the unmanned aerial vehicle model, adjusting the state information of the unmanned aerial vehicle model and the photoelectric pod model in real time to enable the first image to comprise the first target. The state information of the unmanned aerial vehicle model and the photoelectric pod model comprises: the position of the unmanned aerial vehicle model, the attitude of the unmanned aerial vehicle model, the position of the photoelectric pod model and the attitude of the photoelectric pod model.
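The kernel correlation filter (KCF) tracker referred to above learns a correlation filter in the Fourier domain (OpenCV ships it in its contrib module as TrackerKCF). As a greatly simplified stand-in for the underlying idea of relocating the selected target in each new frame, the sketch below does an exhaustive sum-of-squared-differences template search on a tiny grayscale frame; this is an illustration of template matching, not KCF itself:

```python
# Simplified stand-in for template-based tracking. Real KCF learns a
# correlation filter in the Fourier domain; here we just slide the target
# template over a tiny grayscale frame (nested lists) and keep the position
# with the smallest sum of squared differences.

def ssd(frame, template, top, left):
    th, tw = len(template), len(template[0])
    return sum((frame[top + i][left + j] - template[i][j]) ** 2
               for i in range(th) for j in range(tw))

def track(frame, template):
    th, tw = len(template), len(template[0])
    h, w = len(frame), len(frame[0])
    best = min(((ssd(frame, template, r, c), (r, c))
                for r in range(h - th + 1)
                for c in range(w - tw + 1)),
               key=lambda t: t[0])
    return best[1]  # (row, col) where the template matches best

frame = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 9, 0],
         [0, 0, 0, 0]]
template = [[9, 8],
            [7, 9]]
```

Running `track(frame, template)` relocates the target patch at row 1, column 1; KCF performs the analogous search far more efficiently via FFTs and updates its filter online.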
In step S302, the first image is subjected to image processing to obtain pixel coordinates of the first target in the first image.
In this step, the image format of the first image is converted to obtain a second image in an image format supported by a cross-platform computer vision library (opencv). In the cross-platform computer vision library (opencv), the second image is subjected to image processing to obtain the pixel coordinates of the first target in the first image.
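A minimal sketch of this pixel-coordinate extraction, under the illustrative assumption (not stated in the patent) that the target appears as bright pixels that can be thresholded and averaged into a centroid; a real implementation would use OpenCV routines such as cv2.threshold and cv2.moments after converting the simulator's image message with something like cv_bridge:

```python
# Toy version of step S302: threshold a grayscale image (nested lists) and
# take the centroid of the bright pixels as the target's pixel coordinates.
# Assumes, purely for illustration, that the target is the bright region.

def target_pixel_coords(gray, thresh=128):
    pts = [(x, y) for y, row in enumerate(gray)
                  for x, v in enumerate(row) if v >= thresh]
    if not pts:
        return None  # no target found in this frame
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

image = [[0,   0,   0,   0],
         [0, 200, 220,   0],
         [0, 210, 230,   0],
         [0,   0,   0,   0]]
```

The returned (x, y) pair plays the role of the pixel coordinates fed to the positioning computation in step S304.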
In step S303, status information of the unmanned aerial vehicle model and the photoelectric pod model is acquired.
In this step, the state information of the unmanned aerial vehicle model and the photoelectric pod model is acquired. The state information of the unmanned aerial vehicle model and the photoelectric pod model comprises: the position of the unmanned aerial vehicle model, the attitude of the unmanned aerial vehicle model, the position of the photoelectric pod model and the attitude of the photoelectric pod model.
In step S304, a simulation value of the geographical position of the first target in the three-dimensional simulation environment is calculated based on the pixel coordinates and the state information.
FIG. 4 is a schematic diagram of an algorithm development flow. As shown in fig. 4, the algorithm development process includes:
First, the user's requirements are analyzed, and a preliminary design yields an initial scheme for the algorithm. Then, based on the user requirements, coding produces a bottom-layer functional algorithm and an upper-layer logic algorithm. Next, the upper-layer logic algorithm is simulated. Then, based on the simulation result, it is judged whether each index of the developed algorithm is qualified. If qualified, the algorithm is further subjected to hardware and environment tests; if not, the parameters of the upper-layer logic algorithm are adjusted and optimized.
In this step, in the three-dimensional simulation environment, the pixel coordinates of the first target in the first image are transformed between coordinate systems based on those pixel coordinates and the state information of the unmanned aerial vehicle model and the photoelectric pod model, so as to obtain a simulated value of the geographical position of the first target in the three-dimensional simulation environment. For example, the simulated value is calculated by importing into the three-dimensional simulation environment a previously written algorithm for unmanned aerial vehicle target positioning.
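The coordinate-system transformation can be sketched under strong simplifying assumptions that the patent does not state: a nadir-pointing ideal pinhole camera with focal length f in pixels, flat ground at z = 0, no pod offset, and only the UAV's yaw non-zero. All parameter names are invented for this sketch:

```python
import math

# Hedged sketch of step S304's pixel-to-ground transformation. Assumes
# (for illustration only): camera points straight down, ground plane z = 0,
# pinhole model with focal length f in pixels, principal point (cx, cy),
# and a UAV attitude reduced to yaw alone.

def pixel_to_ground(u, v, cx, cy, f, uav_x, uav_y, uav_z, yaw):
    # Ground-plane offsets in the camera frame, by similar triangles:
    # a pixel offset of (u - cx) at height h maps to (u - cx) * h / f metres.
    dx_cam = (u - cx) * uav_z / f
    dy_cam = (v - cy) * uav_z / f
    # Rotate the offsets by the UAV's yaw into the world frame.
    dx = dx_cam * math.cos(yaw) - dy_cam * math.sin(yaw)
    dy = dx_cam * math.sin(yaw) + dy_cam * math.cos(yaw)
    return (uav_x + dx, uav_y + dy)

# Target imaged 64 px right of the principal point, UAV at 80 m, f = 640 px:
pos = pixel_to_ground(u=384, v=240, cx=320, cy=240, f=640.0,
                      uav_x=50.0, uav_y=0.0, uav_z=80.0, yaw=0.0)
```

A full implementation would chain the pixel-to-camera, camera-to-pod, pod-to-body and body-to-world transforms using the complete attitude of both the UAV model and the pod model.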
In step S305, a position true value of the geographic position of the first target in the three-dimensional simulation environment is obtained.
In the step, the first target is tracked in real time through a tracking program of a kernel correlation filtering algorithm, and a position true value of the geographic position of the first target in the three-dimensional simulation environment is obtained in real time.
In step S306, the location true value and the simulation value are compared to determine a simulation error of the geographic location of the first target.
In this step, the position true value and the simulated value are compared to determine the simulation error of the geographic position of the first target. By analyzing the simulation error, the parameters of the algorithm for unmanned aerial vehicle target positioning are adjusted and optimized.
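The comparison in step S306 amounts to subtracting the simulated position from the true one. In this small sketch the simulated numbers are made up; only the true position (56.6, -1.7) is taken from the embodiment described later:

```python
import math

# Sketch of step S306: per-axis and Euclidean simulation error between the
# ground-truth position and the simulated value. The simulated value here
# is invented; the true position reuses the embodiment's (56.6, -1.7).

def simulation_error(true_xy, sim_xy):
    ex = sim_xy[0] - true_xy[0]
    ey = sim_xy[1] - true_xy[1]
    return ex, ey, math.hypot(ex, ey)

ex, ey, dist = simulation_error((56.6, -1.7), (58.1, 0.3))
within_spec = dist <= 5.0  # the embodiment reports errors within 5 metres
```

Plotting such per-axis errors over simulation time is exactly what figs. 5d and 5e show.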
According to the embodiment of the invention, the position true value and the simulated value of the geographic position of the first target in the three-dimensional simulation environment can be obtained directly, so that they can be conveniently compared to determine the simulation error of the geographic position of the first target. By analyzing the simulation error, the parameters of the algorithm for unmanned aerial vehicle target positioning are adjusted and optimized. Therefore, the development time of the algorithm is shortened, the cost of real-flight testing is reduced, and the likelihood of successfully deploying the algorithm is increased.
In an alternative embodiment of the present application, fig. 5a shows a three-dimensional simulation environment for drone target localization. In the three-dimensional simulation environment shown in fig. 5a, the unmanned aerial vehicle model 501 takes off to a certain height (50 m-100 m); after the attitude of the photoelectric pod model carried on the unmanned aerial vehicle model 501 is adjusted, one target in the field of view is selected as the first target, in this case the white vehicle 502. The white vehicle 502 is at rest. It can be understood that the state of the white vehicle 502 is not limited to being stationary; the case in which the white vehicle 502 is moving is equally applicable to the three-dimensional simulation method for unmanned aerial vehicle target positioning of the embodiment of the present invention.
The photoelectric pod model captures a first image including the white vehicle 502 through a built-in camera. The image format of the first image is converted to obtain a second image. In a cross-platform computer vision library (opencv), the second image is subjected to image processing to obtain the pixel coordinates (x_ip, y_ip) of the white vehicle 502 in the first image. The built-in sensors of the unmanned aerial vehicle model 501 acquire the state information of the unmanned aerial vehicle model 501 and the photoelectric pod model in real time.
In the three-dimensional simulation environment, simulated values of the geographic location of the white vehicle 502 are calculated based on the pixel coordinates and the state information. Only the simulated values of the geographic position of the white vehicle 502 in the x-axis and y-axis directions are considered here. Fig. 5b shows the simulation result of drone target location for the white vehicle 502 in the x-axis direction; the abscissa is the time t (unit: seconds) in the three-dimensional simulation environment, and the ordinate is the simulated value of the geographic position of the white vehicle 502 in the x-axis direction. Fig. 5c shows the corresponding simulation result in the y-axis direction, with the same abscissa and the corresponding ordinate. The white vehicle 502 is tracked by a kernel correlation filtering tracking program to obtain the position true value of its geographic position, which in the three-dimensional simulation environment is (56.6, -1.7, 0). The position true value and the simulated values are compared to determine the simulation error of the geographic position of the white vehicle 502. Fig. 5d shows the simulation error in the x-axis direction, and fig. 5e shows the simulation error in the y-axis direction, each plotted against the same time axis. Comparing the simulation results shown in fig. 5b-5e, the simulation error of the unmanned aerial vehicle target positioning is within 5 meters.
Fig. 6 is a schematic structural diagram of a simulation apparatus for drone target location according to an embodiment of the present invention. As shown in fig. 6, the simulation apparatus for drone target positioning includes: an image acquisition module 601, an image processing module 602, a data acquisition module 603 and a calculation module 604.
The image acquisition module 601 is configured to acquire a first image including a first target to be positioned in real time.
The module collects a first image comprising a first target to be positioned in real time when the unmanned aerial vehicle model flies.
An image processing module 602, configured to perform image processing on the first image to obtain a pixel coordinate of the first target in the first image.
The module is used for carrying out image processing on the first image to obtain the pixel coordinates of the first target in the first image.
And the data acquisition module 603 is used for acquiring the state information of the unmanned aerial vehicle model and the photoelectric pod model in real time.
The module acquires state information of the unmanned aerial vehicle model and the photoelectric pod model.
A calculating module 604, configured to calculate, in real-time simulation, a simulation value of the geographic position of the first target in the three-dimensional simulation environment based on the pixel coordinates and the state information.
The module calculates a simulated value of a geographic position of a first target in a three-dimensional simulation environment based on pixel coordinates of the first target in a first image and state information of the drone model and the optoelectronic pod model.
Fig. 7 is a schematic structural diagram of a simulation apparatus for drone target positioning according to an embodiment of the present invention. This embodiment elaborates on the embodiment of fig. 6. As shown in fig. 7, the simulation apparatus for drone target positioning includes: an image acquisition module 701, an image processing module 702, a data acquisition module 703, a calculation module 704, a tracking module 705 and a comparison module 706.
An image acquisition module 701 is configured to acquire a first image including a first target to be positioned in real time.
The module selects a first target to be positioned from a plurality of images acquired by the photoelectric pod model. For example, when a target to be positioned appears in the plurality of images acquired in real time by the photoelectric pod model, the target is selected as the first target by manually framing it. After the first target is selected, it is tracked by a tracking program based on a kernel correlation filtering (KCF) algorithm. While the photoelectric pod model carried on the unmanned aerial vehicle model acquires the first image in real time, the state information of the unmanned aerial vehicle model and the photoelectric pod model is adjusted in real time so that the first image includes the first target. The state information of the unmanned aerial vehicle model and the photoelectric pod model includes: the position of the unmanned aerial vehicle model, the attitude of the unmanned aerial vehicle model, the position of the photoelectric pod model and the attitude of the photoelectric pod model.
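The KCF tracker named above is available in OpenCV's tracking module; as a dependency-free illustration of the track-by-local-search idea (not KCF itself), the following sketch follows a framed target from frame to frame with a plain sum-of-absolute-differences template search. All names (`track_step`, `crop`, `sad`) are hypothetical.

```python
# Illustrative stand-in for the tracking step: SAD template search
# around the previous target position. Frames are grayscale images
# represented as nested lists of pixel intensities.

def crop(img, x, y, w, h):
    """Extract a w x h patch whose top-left corner is (x, y)."""
    return [row[x:x + w] for row in img[y:y + h]]

def sad(a, b):
    """Sum of absolute differences between two equally sized patches."""
    return sum(abs(pa - pb) for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

def track_step(img, template, x0, y0, radius=3):
    """Search a (2*radius+1)^2 neighbourhood of the previous top-left
    (x0, y0) for the best match of `template`; return the new top-left."""
    h, w = len(template), len(template[0])
    best = (float("inf"), x0, y0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x, y = x0 + dx, y0 + dy
            if x < 0 or y < 0 or y + h > len(img) or x + w > len(img[0]):
                continue
            cost = sad(crop(img, x, y, w, h), template)
            best = min(best, (cost, x, y))
    return best[1], best[2]

# Toy frames: a bright 2x2 blob moves one pixel right and one down.
frame0 = [[0] * 8 for _ in range(8)]
frame1 = [[0] * 8 for _ in range(8)]
for dy in range(2):
    for dx in range(2):
        frame0[2 + dy][2 + dx] = 255
        frame1[3 + dy][3 + dx] = 255

tpl = crop(frame0, 2, 2, 2, 2)        # template framed in the first image
print(track_step(frame1, tpl, 2, 2))  # -> (3, 3)
```

A KCF tracker replaces the exhaustive SAD search with correlation in the Fourier domain, which is why it runs in real time on full frames.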
An image processing module 702, configured to perform image processing on the first image to obtain a pixel coordinate of the first target in the first image.
The module is used for converting the image format of the first image to obtain a second image in an image format supported by the cross-platform computer vision library OpenCV, and then performing image processing on the second image in OpenCV to obtain the pixel coordinates of the first target in the first image.
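The disclosure does not fix a particular image-processing routine; in OpenCV this step is typically thresholding followed by moment or contour analysis. As a minimal, dependency-free sketch of "image processing to pixel coordinates", the following computes the centroid of pixels brighter than a threshold (suitable for the white vehicle 502 of the experiments). The name `target_pixel` is illustrative.

```python
# Sketch of the image-processing step: locate a bright target as the
# centroid (u, v) of all pixels above an intensity threshold.

def target_pixel(gray, threshold=200):
    """Return the centroid (u, v) of all pixels brighter than
    `threshold`, or None if no such pixel exists in the frame."""
    xs, ys = [], []
    for v, row in enumerate(gray):
        for u, val in enumerate(row):
            if val > threshold:
                xs.append(u)
                ys.append(v)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

# A white 2x2 vehicle blob on a dark 6x6 frame.
frame = [[0] * 6 for _ in range(6)]
for v in (2, 3):
    for u in (1, 2):
        frame[v][u] = 255
print(target_pixel(frame))  # -> (1.5, 2.5)
```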
And the data acquisition module 703 is used for acquiring the state information of the unmanned aerial vehicle model and the photoelectric pod model in real time.
The module acquires state information of the unmanned aerial vehicle model and the photoelectric pod model. The state information of the unmanned aerial vehicle model and the photoelectric pod model comprises: the position of the unmanned aerial vehicle model, the attitude of the unmanned aerial vehicle model, the position of the photoelectric pod model and the attitude of the photoelectric pod model.
A calculating module 704, configured to calculate, in real-time simulation, a simulation value of the geographic position of the first target in the three-dimensional simulation environment based on the pixel coordinates and the state information.
The module is used for performing a coordinate system transformation on the pixel coordinates of the first target in the first image, based on those pixel coordinates and the state information of the unmanned aerial vehicle model and the photoelectric pod model, to obtain a simulated value of the geographic position of the first target in the three-dimensional simulation environment. For example, the simulated value of the geographic position of the first target is calculated by importing a previously written unmanned aerial vehicle target positioning algorithm into the three-dimensional simulation environment.
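The disclosure leaves the transformation chain abstract. A common realization, sketched here under stated assumptions, back-projects the pixel through a pinhole camera model and intersects the viewing ray with flat ground at z = 0. A real embodiment would compose the pod-to-body and body-to-world rotations from the state information; here one camera-to-world rotation matrix `R` stands in for that chain, and all names are illustrative.

```python
# Hedged sketch of pixel -> geographic position: back-project pixel
# (u, v) through a pinhole model and intersect the ray with z = 0.

def pixel_to_ground(u, v, fx, fy, cx, cy, R, cam_pos):
    """Return the (x, y) ground intersection of the viewing ray through
    pixel (u, v). fx, fy, cx, cy are pinhole intrinsics; `R` rotates
    camera axes into world axes; `cam_pos` is the camera position
    (x, y, z) in the world frame, z being height above ground."""
    # Ray direction in camera coordinates (camera z points forward).
    d_cam = ((u - cx) / fx, (v - cy) / fy, 1.0)
    # Rotate the ray into world coordinates.
    d = tuple(sum(R[i][j] * d_cam[j] for j in range(3)) for i in range(3))
    if d[2] >= 0:
        raise ValueError("ray does not hit the ground")
    t = -cam_pos[2] / d[2]  # scale factor that brings the ray to z = 0
    return (cam_pos[0] + t * d[0], cam_pos[1] + t * d[1])

# Camera 100 m above the origin, looking straight down: camera z maps
# to world -z, camera x to world x, camera y to world -y.
R_down = ((1, 0, 0), (0, -1, 0), (0, 0, -1))
xy = pixel_to_ground(u=320, v=240, fx=800, fy=800, cx=320, cy=240,
                     R=R_down, cam_pos=(0.0, 0.0, 100.0))
print(xy)  # -> (0.0, 0.0): the principal point maps to nadir
```

An off-center pixel shifts the ground point proportionally to height over focal length, which is why positioning error grows with flight altitude and gimbal attitude error.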
The tracking module 705 is configured to track the first target in real time through a tracking program of a kernel correlation filtering algorithm, and obtain a true position value of a geographic position of the first target in the three-dimensional simulation environment in real time.
The module tracks the first target in real time through a tracking program of a kernel correlation filtering algorithm, and obtains a position true value of the geographical position of the first target in the three-dimensional simulation environment in real time.
A comparing module 706 configured to compare the location true value and the simulation value to determine a simulation error of the drone model locating the geographic location of the first target.
The module compares the position true value and the simulated value to determine the simulation error of the geographic position of the first target. The simulation error is then analyzed, and the parameters of the unmanned aerial vehicle target positioning algorithm are adjusted and optimized accordingly.
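The comparison step reduces to differencing the tracked ground-truth trajectory against the simulated one, per axis and per time step, as in the error curves of figs. 5b-5e. A minimal sketch, with illustrative function names:

```python
# Sketch of the comparison module: per-axis error samples and a
# summary RMS between ground-truth and simulated (x, y) trajectories.

def per_axis_error(truth, sim):
    """Element-wise (x, y) error for each timestamped sample pair."""
    return [(tx - sx, ty - sy) for (tx, ty), (sx, sy) in zip(truth, sim)]

def rms(values):
    """Root-mean-square of a sequence of scalar errors."""
    return (sum(v * v for v in values) / len(values)) ** 0.5

truth = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
sim   = [(0.5, 0.0), (1.0, 1.5), (2.0, 2.0)]
errs = per_axis_error(truth, sim)
print(errs)                       # -> [(-0.5, 0.0), (0.0, -0.5), (0.0, 0.0)]
print(rms([e[0] for e in errs]))  # x-axis RMS error, here well under 5 m
```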
Fig. 8 is a structural diagram of a simulation control apparatus for drone target location according to an embodiment of the present invention. The apparatus shown in fig. 8 is only an example and should not limit the functionality and scope of use of embodiments of the present invention in any way.
Referring to fig. 8, the apparatus includes a processor 801, a memory 802, and an input/output device 803 connected by a bus. The memory 802 includes a Read Only Memory (ROM) and a Random Access Memory (RAM); it stores the computer instructions and data required to perform system functions, and the processor 801 reads those instructions from the memory 802 to perform various appropriate actions and processes. The input/output device 803 includes an input section such as a keyboard and a mouse; an output section such as a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD) and a speaker; a storage section such as a hard disk; and a communication section such as a LAN card or modem network interface. The memory 802 further stores computer instructions that perform the operations specified by the simulation method for drone target positioning of the present invention: when the unmanned aerial vehicle model flies, acquiring a first image including a first target to be positioned through a photoelectric pod model carried on the unmanned aerial vehicle model; performing image processing on the first image to obtain the pixel coordinates of the first target in the first image; acquiring state information of the unmanned aerial vehicle model and the photoelectric pod model; and calculating a simulated value of the geographic position of the first target in the three-dimensional simulation environment based on the pixel coordinates and the state information.
Accordingly, an embodiment of the present invention provides a computer-readable storage medium, where computer instructions are stored, and when executed, the computer instructions implement the operations specified in the simulation method for target positioning of a drone.
The flowcharts and block diagrams in the figures illustrate possible architectures, functions, and operations of systems, methods, and apparatuses according to embodiments of the present invention. Each block may represent a module, a program segment, or a code segment containing executable instructions for implementing a specified logical function. It should also be noted that the executable instructions implementing the specified logical functions may be recombined to create new modules and program segments. The blocks of the drawings, and their order, are thus provided to better illustrate the processes and steps of the embodiments and should not be taken as limiting the invention itself.
The above description is only a few embodiments of the present invention, and is not intended to limit the present invention, and various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (14)

1. A simulation method for unmanned aerial vehicle target positioning is characterized in that an unmanned aerial vehicle model is built in a three-dimensional simulation environment, and the simulation method for unmanned aerial vehicle target positioning comprises the following steps:
when the unmanned aerial vehicle model flies, acquiring a first image comprising a first target to be positioned through a photoelectric pod model carried on the unmanned aerial vehicle model;
performing image processing on the first image to obtain a pixel coordinate of the first target in the first image;
acquiring state information of the unmanned aerial vehicle model and the photoelectric pod model; and
based on the pixel coordinates and the state information, a simulated value of the geographic location of the first target in the three-dimensional simulation environment is calculated.
2. The method for simulating unmanned aerial vehicle target positioning according to claim 1, further comprising:
obtaining a position true value of the geographic position of the first target in the three-dimensional simulation environment;
comparing the location true value and the simulated value to determine a simulated error of the geographic location of the first target.
3. The method for simulating unmanned aerial vehicle target positioning according to claim 2, wherein the acquiring a first image including a first target to be positioned through an optoelectronic pod model mounted thereon while the unmanned aerial vehicle model is flying comprises:
selecting the first target to be positioned from a plurality of images acquired by the photoelectric pod model;
tracking the first target based on a kernel correlation filtering algorithm tracking program; and
in the process of acquiring the first image in real time by the photoelectric pod model carried on the unmanned aerial vehicle model, adjusting the state information of the unmanned aerial vehicle model and the photoelectric pod model in real time so that the first image includes the first target.
4. The method of claim 3, wherein the image processing of the first image to obtain the pixel coordinates of the first target in the first image comprises:
converting the image format of the first image to obtain a second image;
and in a cross-platform computer vision library, carrying out image processing on the second image to obtain the pixel coordinates of the first target in the first image.
5. The method of claim 4, wherein said calculating a simulated value of a geographic position of said first target in said three-dimensional simulation environment based on said pixel coordinates and said state information comprises:
and based on the pixel coordinates and the state information, performing coordinate system transformation on the pixel coordinates of the first target in the first image to obtain a simulated value of the geographic position of the first target in the three-dimensional simulation environment.
6. The method for simulating unmanned aerial vehicle target positioning according to claim 5, wherein the state information of the unmanned aerial vehicle model and the optoelectronic pod model comprises: the position of the unmanned aerial vehicle model, the attitude of the unmanned aerial vehicle model, the position of the photoelectric pod model and the attitude of the photoelectric pod model.
7. A simulation apparatus for unmanned aerial vehicle target positioning, characterized by comprising:
the image acquisition module is used for acquiring a first image comprising a first target to be positioned in real time;
the image processing module is used for carrying out image processing on the first image to obtain the pixel coordinates of the first target in the first image;
the data acquisition module is used for acquiring the state information of the unmanned aerial vehicle model and the photoelectric pod model in real time; and
and the calculating module is used for calculating a simulation value of the geographical position of the first target in the three-dimensional simulation environment in a real-time simulation mode based on the pixel coordinates and the state information.
8. The drone target positioning simulation device of claim 7, further comprising:
the tracking module is used for tracking the first target in real time through a tracking program of a kernel correlation filtering algorithm and obtaining a position true value of the geographic position of the first target in the three-dimensional simulation environment in real time;
a comparison module for comparing the location true value and the simulation value to determine a simulation error of the unmanned aerial vehicle model locating the geographic location of the first target.
9. The drone target positioning simulation device of claim 8, wherein the real-time acquisition of the first image including the first target to be positioned includes:
selecting the first target to be positioned from a plurality of images acquired in real time;
tracking the first target based on a kernel correlation filtering algorithm tracking program; and
adjusting the state information of the unmanned aerial vehicle model and the optoelectronic pod model in real time during the acquisition of the first image in real time so that the first image includes the first target.
10. The apparatus of claim 9, wherein the image processing of the first image to obtain pixel coordinates of the first target in the first image comprises:
converting the image format of the first image to obtain a second image;
and in a cross-platform computer vision library, carrying out image processing on the second image to obtain the pixel coordinates of the first target in the first image.
11. The drone target positioning simulation device of claim 10, wherein the real-time simulation calculating simulated values of the geographic location of the first target in the three-dimensional simulation environment based on the pixel coordinates and the status information comprises:
and performing coordinate system transformation on the pixel coordinates of the first target in the first image based on the pixel coordinates and the state information, and calculating a simulation value of the geographical position of the first target in the three-dimensional simulation environment in real time.
12. The drone target positioning simulation device of claim 11, wherein the state information of the drone model and the optoelectronic pod model includes: the position of the unmanned aerial vehicle model, the attitude of the unmanned aerial vehicle model, the position of the photoelectric pod model and the attitude of the photoelectric pod model.
13. A computer-readable storage medium, characterized in that it stores computer instructions that, when executed, implement the method of simulation of drone target localization according to any one of claims 1 to 6.
14. A simulation control apparatus for unmanned aerial vehicle target positioning, characterized by comprising:
a memory for storing computer instructions;
a processor coupled to the memory, the processor being configured to execute, based on the computer instructions stored by the memory, the simulation method for unmanned aerial vehicle target positioning of any one of claims 1 to 6.
CN202010035235.XA 2020-01-14 2020-01-14 Simulation method and device for unmanned aerial vehicle target positioning Pending CN113189890A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010035235.XA CN113189890A (en) 2020-01-14 2020-01-14 Simulation method and device for unmanned aerial vehicle target positioning

Publications (1)

Publication Number Publication Date
CN113189890A 2021-07-30

Family

ID=76972280

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010035235.XA Pending CN113189890A (en) 2020-01-14 2020-01-14 Simulation method and device for unmanned aerial vehicle target positioning

Country Status (1)

Country Link
CN (1) CN113189890A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090087029A1 (en) * 2007-08-22 2009-04-02 American Gnc Corporation 4D GIS based virtual reality for moving target prediction
US20120075168A1 (en) * 2010-09-14 2012-03-29 Osterhout Group, Inc. Eyepiece with uniformly illuminated reflective display
CN106777489A (en) * 2016-11-22 2017-05-31 中国人民解放军陆军军官学院 UAV system opto-electric stabilization turntable tracks state modeling and simulating method
CN108279576A (en) * 2017-12-26 2018-07-13 湖北航天技术研究院总体设计所 A kind of composite shaft target following emulation test system
CN110347035A (en) * 2018-04-08 2019-10-18 北京京东尚科信息技术有限公司 Method for autonomous tracking and device, electronic equipment, storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Cheng Xiaolei: "The impact of drone express delivery on the terminal distribution occupational system", Logistics Technology and Application, 10 August 2017 (2017-08-10) *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination