CN116068928A - Distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system and method - Google Patents

Distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system and method

Info

Publication number
CN116068928A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
information
target
ground station
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211477244.XA
Other languages
Chinese (zh)
Inventor
徐振涛
郑智辉
丛龙剑
姚征
栾健
周娟
禹春梅
张志良
郭海雷
闫威
郭宸瑞
王硕
董昊天
张俊明
闫新颖
康旭冰
贾玉姣
谭亚雄
周帅军
丁吉
唐君
周华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Aerospace Automatic Control Research Institute
Original Assignee
Beijing Aerospace Automatic Control Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Aerospace Automatic Control Research Institute filed Critical Beijing Aerospace Automatic Control Research Institute
Priority to CN202211477244.XA priority Critical patent/CN116068928A/en
Publication of CN116068928A publication Critical patent/CN116068928A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/04: Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042: Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423: Input/output
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/25: Pc structure of the system
    • G05B2219/25257: Microcontroller
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides an air-ground integrated control system and method for a distributed heterogeneous unmanned aerial vehicle cluster, relating to the technical field of unmanned aerial vehicle control, with the aim of improving the intelligent perception capability of the cluster in complex environments. The system comprises: a swarm controller, which calculates the local flight path and acceleration information according to the state information of the other unmanned aerial vehicles in the cluster and sends the local flight path and acceleration information to an information processing board; and the information processing board, which receives video frames from the electro-optical pod and other optical payloads as well as detection data from detection-type payloads, identifies targets in the video frames using an artificial intelligence algorithm, and sends the video frames to the ground station; the information processing board also receives the local flight path and acceleration information calculated by the swarm controller and sends them to the flight controller, so that the local unmanned aerial vehicle autonomously cooperates with the cluster to complete the flight task. The control system of the distributed heterogeneous unmanned aerial vehicle cluster is applied in the control method of the distributed heterogeneous unmanned aerial vehicle cluster.

Description

Distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system and method
Technical Field
The invention relates to the technical field of unmanned aerial vehicle control, in particular to a distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system and method.
Background
With the continuous development of technology and the refinement of military theory, unmanned aerial vehicles have gradually become indispensable equipment on the modern battlefield. The battlefield environment of the 21st century will be multi-dimensional, multi-modal and integrated, requiring reconnaissance and strike equipment capable of analyzing and understanding this complex environment. Although the flight task of a single unmanned aerial vehicle can be handled by combining deep learning techniques with guidance and control methods, an unmanned aerial vehicle cluster executing a flight task still faces a series of cooperative problems such as target allocation, group decision-making and flight path planning; maximizing cluster flight efficiency and completing these tasks require cooperative detection and perception capability with group consistency. In addition, for complex tasks the capability of a single-payload unmanned aerial vehicle group is very limited, so the trend in unmanned aerial vehicle operations is toward diversified configurations and payloads.
The key technologies that must be solved to achieve fully autonomous cluster control mainly include: environmental perception and situational awareness, multi-vehicle cooperative task planning and decision-making, information interaction and autonomous control, and human-machine intelligent fusion with adaptive learning. In addition, the key to making such unmanned aerial vehicle clusters practical is ground station software that assists the operator in cluster operation. Patent document CN107831783A describes a ground station control system supporting autonomous flight of multiple unmanned aerial vehicles, which has the following drawbacks: (1) it lacks the ability to control heterogeneous unmanned aerial vehicles operating simultaneously; (2) system parameters must be configured on every unmanned aerial vehicle before take-off, making deployment cumbersome; (3) task allocation is planned centrally on the ground and then sent to each unmanned aerial vehicle, and the flight path cannot be modified during flight, so a task that has already been issued cannot be adjusted to actual conditions; (4) the unmanned aerial vehicle cannot correct its flight path in real time as conditions change during flight; (5) the system does not work with detectors or other payloads mounted on the unmanned aerial vehicle, so it can only control the group to complete a flight task and cannot perform reconnaissance or detection; (6) its human-machine interaction capability is insufficient, and the ground station software serves only as a remote controller for unmanned aerial vehicle flight.
Disclosure of Invention
To solve the above technical problems, the invention provides a distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system and method, which identifies targets in the pod picture through AI so as to improve the intelligent perception capability of the cluster in complex environments, and which connects to on-board equipment such as the electro-optical pod, the flight controller and the swarm controller to realize cooperative reconnaissance task control of the distributed heterogeneous unmanned aerial vehicles.
The invention provides a distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system, which comprises: a plurality of unmanned aerial vehicles and a ground station in communication with the plurality of unmanned aerial vehicles, wherein,
each of the plurality of unmanned aerial vehicles includes:
a swarm controller, configured to calculate the local flight path and acceleration information according to the state information of the other unmanned aerial vehicles in the cluster and to send the local flight path and acceleration information to the information processing board;
a flight controller, configured to control the attitude of the unmanned aerial vehicle and to adjust the speed and flight direction of the unmanned aerial vehicle;
an electro-optical pod, configured to capture video frames in real time and to track targets;
an information processing board, configured to receive video frames from the electro-optical pod and other optical payloads as well as detection data from detection-type payloads, to identify targets in the video frames using an artificial intelligence algorithm and to send the video frames to the ground station; and to receive the local flight path and acceleration information calculated by the swarm controller and send them to the flight controller, so that the local unmanned aerial vehicle autonomously cooperates with the cluster to complete the flight task.
Preferably, the information processing board includes a data access module, the data access module being configured to:
receive telemetry data from the flight controller as a flight-control information topic and send the flight-control information topic to the ground station node, so that the unmanned aerial vehicle state information can be monitored through the ground station; create an unmanned aerial vehicle node and, based on the unmanned aerial vehicle node, subscribe to the task-input topic published by the ground station node, so that all unmanned aerial vehicles in the cluster receive the task-input topic; establish a swarm-control topic based on the unmanned aerial vehicle node, subscribe to the swarm-control topics of the other unmanned aerial vehicles in the cluster, and send the swarm-control topics of the other unmanned aerial vehicles to the swarm controller, the swarm controller calculating the local flight path and acceleration information according to the swarm-control topics of the other unmanned aerial vehicles so as to control the local unmanned aerial vehicle to autonomously cooperate with the other unmanned aerial vehicles in the cluster to complete the flight task.
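By way of illustration only, a minimal C++ (roscpp) sketch of such a data access node is given below. The topic names, the use of std_msgs/UInt8MultiArray for the packed frames, the fixed drone_id and the 10 Hz telemetry rate are assumptions made for this example and are not prescribed by the disclosure; the serial-port exchange with the flight controller and the swarm controller is only indicated in comments.

#include <ros/ros.h>
#include <std_msgs/UInt8MultiArray.h>
#include <string>

// Hypothetical callback: the packed task data would be forwarded here to the
// serial port connected to the swarm controller (serial I/O omitted).
void onTaskInput(const std_msgs::UInt8MultiArray::ConstPtr& msg)
{
  (void)msg;  // forward msg->data to the swarm controller
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "uav_data_access_node");
  ros::NodeHandle nh;
  const int drone_id = 1;  // illustrative identifier of the local UAV

  // Publish the local flight-control telemetry (read elsewhere from the flight
  // controller's serial port) as the flight-control information topic that the
  // ground station node subscribes to.
  ros::Publisher telem_pub = nh.advertise<std_msgs::UInt8MultiArray>(
      "/uav" + std::to_string(drone_id) + "/flight_ctrl_info", 10);

  // Subscribe to the task-input topic published by the ground station node so
  // that every UAV in the cluster receives the same mission data.
  ros::Subscriber task_sub =
      nh.subscribe("/ground_station/task_input", 10, onTaskInput);

  ros::Rate rate(10);  // assumed 10 Hz telemetry period
  while (ros::ok()) {
    std_msgs::UInt8MultiArray pkt;
    // pkt.data would hold one telemetry frame packed per the agreed protocol.
    telem_pub.publish(pkt);
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}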
Preferably, the information processing board includes a cooperative control module, the cooperative control module being configured to:
activate the swarm controller to perform the initial flight-path calculation according to the task-input topic.
Preferably, the information processing board includes an on-board perception module, the on-board perception module being configured to:
identify and track targets in the video frames based on an artificial intelligence recognition model and a tracking algorithm, superimpose an identification box in the video frame to envelop each recognized target, assign the same target ID to the same recognized target, encode the video frames with the superimposed identification boxes and assigned target IDs, and push them to the ground station.
Preferably, the information processing board includes a human-machine interaction module, the human-machine interaction module being configured so that:
the information processor identifies targets in the continuous video frames captured by the electro-optical pod, marks them and sends them to the ground station; an operator locks a target by clicking it on the ground station, the target is tracked, and the locked position is sent to the ground station; or the operator frame-selects an identified target on the ground station, the frame-selected target is tracked, and the locked position is sent to the ground station.
Preferably, the plurality of unmanned aerial vehicles comprise unmanned aerial vehicles with various configurations, and a distributed control structure is formed among the unmanned aerial vehicles with the various configurations.
Preferably, the ground station is configured to:
receive the real-time positions and flight paths of the unmanned aerial vehicles, and display the real-time state information of all unmanned aerial vehicles;
enter a task-information input screen before the unmanned aerial vehicles fly and set task information therein, the task information including the number of unmanned aerial vehicles, the cruising area and the no-fly zone, the task information being sent through the information processing board and the data link to the swarm controller for task decision-making;
during reconnaissance, lock a target and control the electro-optical pod to track the target, or cancel the tracking and locking of the target by the electro-optical pod.
Compared with the prior art, the distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system provided by the invention has the following beneficial effects: the swarm controller calculates the local flight path and acceleration information according to the state information of the other unmanned aerial vehicles in the cluster and sends the local flight path and acceleration information to the information processing board; the information processing board receives the local flight path and acceleration information calculated by the swarm controller and sends them to the flight controller; and the flight controller controls the attitude of the unmanned aerial vehicle and adjusts its speed and flight direction, so that the unmanned aerial vehicle autonomously cooperates with the cluster to complete the flight task. Cooperative reconnaissance task control of the distributed heterogeneous unmanned aerial vehicles is thereby realized.
The invention also provides a distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control method, comprising the following steps. Step S1: the swarm controller calculates the local flight path and acceleration information according to the state information of the other unmanned aerial vehicles in the cluster and sends the local flight path and acceleration information to the information processing board. Step S2: the information processing board receives video frames from the electro-optical pod and other optical payloads as well as detection data from detection-type payloads, identifies targets in the video frames using an artificial intelligence algorithm and sends the video frames to the ground station; it also receives the local flight path and acceleration information calculated by the swarm controller and sends them to the flight controller, so that the local unmanned aerial vehicle autonomously cooperates with the cluster to complete the flight task.
Compared with the prior art, the beneficial effects of the distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control method provided by the invention are the same as those of the distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system described in the above technical solution, and are not repeated here.
In order to make the above objects, features and advantages of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 shows a schematic structural diagram of a distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system provided by an embodiment of the invention;
fig. 2 shows a hardware connection architecture diagram of the distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system provided by an embodiment of the invention;
FIG. 3 illustrates an autonomous cooperative control scheme provided by an embodiment of the present invention;
fig. 4 shows a schematic diagram of unmanned aerial vehicle status information collection provided by an embodiment of the present invention;
FIG. 5 illustrates a schematic diagram of reconnaissance target locking provided by an embodiment of the present invention;
FIG. 6 is a schematic diagram of an initialization interface provided by an embodiment of the present invention;
FIG. 7 illustrates a map interface schematic provided by an embodiment of the present invention;
fig. 8 shows a schematic diagram of a no-fly zone setting interface provided by an embodiment of the present invention;
fig. 9 shows a schematic diagram of a zero setting interface provided by an embodiment of the present invention;
FIG. 10 illustrates a schematic diagram of the pod video interface provided by an embodiment of the invention;
fig. 11 shows a schematic diagram of a ground test of the pod recognition picture provided by an embodiment of the invention.
Detailed Description
In the description of the present invention, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present invention, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
The term "plurality" as used in this embodiment means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a alone, a and B together, and B alone. The words "exemplary" or "such as" are used to mean serving as an example, instance, or illustration, intended to present concepts related in a specific manner, and should not be interpreted as being preferred or advantageous over other embodiments or designs.
Fig. 1 shows a schematic structural diagram of the distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system provided by an embodiment of the invention, fig. 2 shows a hardware connection architecture diagram of the system, and fig. 3 shows a software architecture diagram of the system.
As shown in fig. 1-3, an embodiment of the present invention provides an air-ground integrated control system of a distributed heterogeneous unmanned aerial vehicle cluster, where the system includes:
a plurality of unmanned aerial vehicles, and a ground station 10 communicatively coupled to the plurality of unmanned aerial vehicles, wherein,
each of the plurality of unmanned aerial vehicles includes:
the bee group controller 60 is configured to calculate a local track and acceleration information according to status information of other unmanned aerial vehicles in the unmanned aerial vehicle cluster, and send the local track and acceleration information to the information processing board;
a flight controller 50 for controlling the attitude of the unmanned aerial vehicle and adjusting the speed and flight direction of the unmanned aerial vehicle;
a photoelectric pod 40 for capturing video frames in real time and performing target tracking;
an information processing board 30 for receiving video frames for receiving optical pods and optical-type payloads and probe data for probing the type payloads, identifying targets in the video frames using an artificial intelligence algorithm, and transmitting the video frames to a ground station; the local track and acceleration information calculated by the bee colony controller 60 is received and sent to the flight controller 50 to control the unmanned aerial vehicle in the autonomous cooperative cluster to complete the flight task.
It should be noted that the ground station 10 may be a portable computer or a touch-screen mobile terminal. The ground station 10 is connected to the on-board information processing boards 30 through an ad hoc network data link 20 and a network port. All information processing boards 30 in the cluster and the ground station 10 communicate over the data link 20, forming a centerless wireless local network for air-ground information interaction. It should be appreciated that the plurality of unmanned aerial vehicles includes unmanned aerial vehicles of several configurations, and a distributed control structure is formed among them. Specifically, the information processing board 30 receives video data from the electro-optical pod through a video interface, which may for example be MIPI. The information processing board 30 exchanges data with the electro-optical pod through a serial interface such as CAN. The information processing board 30 receives telemetry information from the flight controller 50 through a serial port such as RS422, and sends waypoint instructions and acceleration information to the flight controller 50 to control the unmanned aerial vehicle to complete formation flight and cooperative detection. The information processing board 30 is connected to the swarm controller 60 through a serial port such as RS422; it sends the necessary swarm-control quantities, such as inter-vehicle information and unmanned aerial vehicle state information, to the swarm controller 60, receives the local flight path and acceleration information calculated by the swarm controller 60, and forwards them to the flight controller 50, so that the local unmanned aerial vehicle cooperates with the other unmanned aerial vehicles in the cluster to complete the task.
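As a minimal sketch of one of these serial links, the following C++ fragment opens a raw serial device on the information processing board and reads one telemetry chunk from the flight controller. The device path /dev/ttyS1 and the 115200 baud rate are assumptions for the example; the disclosure only names the electrical standards (e.g. RS422, CAN), and the framing follows the agreed protocol.

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdio>

int openSerial(const char* dev, speed_t baud)
{
  int fd = open(dev, O_RDWR | O_NOCTTY);
  if (fd < 0) { perror("open"); return -1; }

  termios tio{};
  tcgetattr(fd, &tio);
  cfmakeraw(&tio);              // raw 8-bit transparent link for binary frames
  cfsetispeed(&tio, baud);
  cfsetospeed(&tio, baud);
  tio.c_cflag |= (CLOCAL | CREAD);
  tcsetattr(fd, TCSANOW, &tio);
  return fd;
}

int main()
{
  // Telemetry in from the flight controller; waypoint/acceleration commands out.
  int fc_fd = openSerial("/dev/ttyS1", B115200);
  if (fc_fd < 0) return 1;

  unsigned char buf[256];
  ssize_t n = read(fc_fd, buf, sizeof(buf));   // read one telemetry chunk
  if (n > 0) {
    // ...unpack per the agreed protocol and republish as a ROS topic...
  }
  close(fc_fd);
  return 0;
}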
Further, ground station software is installed in the ground station terminal; it is built on top of the ROS system using the Qt framework and the C++ language, and provides a graphical human-machine interaction interface.
Specifically, ground station software is arranged in the ground station equipment terminal and is used for realizing the following functions:
(1) Connection function: the number of aircraft performing the task is specified, and a connection is established between the ground station and the on-board information processing boards.
(2) Before the unmanned aerial vehicles fly, a task-information input screen is entered and task information is set there, including the number of unmanned aerial vehicles, the cruising area and the no-fly zone; the task information is sent through the information processing board and the data link to the swarm controller for task decision-making. Specifically, the connection with the swarm controller is relayed through the information processing board, which provides enabling operations such as starting and ending a task. Before take-off, task data such as the cruising area, no-fly zone and flight altitude range are set on the map interface of the ground station and forwarded through the information processing board to the swarm controller, which then starts the task decision-making work. The ground station can also cancel the current task by sending an instruction to the information processing board, which forwards it to the swarm controller. After a task is cancelled, the unmanned aerial vehicle keeps its current flight state until a new task or a return instruction is sent.
(3) While the unmanned aerial vehicles are flying and performing the reconnaissance task, the real-time positions and flight paths of the unmanned aerial vehicles are received, and the ground station map interface displays the real-time positions and traversed flight paths of all unmanned aerial vehicles. In addition, an information bar displays the real-time state of all unmanned aerial vehicles in the cluster, for example longitude, latitude, altitude, speed, roll, yaw and pitch.
(4) The ground station decodes the RTSP video stream pushed by the on-board information processing board, displays the reconnaissance video captured by the electro-optical pod, and superimposes the target identification boxes and the corresponding target ID numbers recognized by the information processing board (a decoding sketch is given after this list).
(5) During reconnaissance, a target can be locked through the ground station and tracked by controlling the electro-optical pod. The locking modes are frame-selection locking and click locking. In frame-selection locking, the operator draws a rectangle around the target on the video interface, and the electro-optical pod locks and tracks the target inside the selected frame. In click locking, the operator clicks the ID number of an identified target, and the electro-optical pod locks and tracks the target corresponding to that ID. The operator can also cancel the tracking and locking of a target by the electro-optical pod through the ground station.
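For function (4), a minimal C++/OpenCV sketch of decoding and displaying such an RTSP stream is given below; the stream URL is illustrative, and the actual ground station software embeds the decoded frames in its Qt interface rather than an OpenCV window.

#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
  // Illustrative URL of the RTSP stream pushed by one information processing board.
  cv::VideoCapture cap("rtsp://192.168.1.11:8554/pod", cv::CAP_FFMPEG);
  if (!cap.isOpened()) {
    std::cerr << "failed to open RTSP stream" << std::endl;
    return 1;
  }
  cv::Mat frame;
  while (cap.read(frame)) {
    // The frame already carries the identification boxes and target IDs
    // superimposed on board; here it is simply displayed.
    cv::imshow("pod video", frame);
    if (cv::waitKey(1) == 27) break;  // Esc to quit
  }
  return 0;
}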
Further, the information processing board 30 includes:
a data access module 302, the data access module 302 being configured to: receive telemetry data from the flight controller 50 as a flight-control information topic and send it to the ground station node, so that the unmanned aerial vehicle state information can be monitored through the ground station; create an unmanned aerial vehicle node and, based on that node, subscribe to the task-input topic published by the ground station node, so that every unmanned aerial vehicle in the cluster receives the task-input topic; and establish a swarm-control topic based on the unmanned aerial vehicle node, subscribe to the swarm-control topics of the other unmanned aerial vehicles in the cluster, and send them to the swarm controller 60, which calculates the local flight path and acceleration information from them so that the local unmanned aerial vehicle autonomously cooperates with the cluster to complete the flight task.
Specifically, as shown in fig. 2, the data information generated by components such as the flight controller 50, the swarm controller 60 and the electro-optical pod 40 is read through serial ports according to the specified communication protocols and published to the correct ports, for example the serial ports in fig. 2 connected to the pod driver 3023, the flight-control driver 3022 and the swarm-control driver 3021, i.e. ports readable by the on-board devices. The data access module 302 can also access video output interfaces such as MIPI and SDI; the image acquisition device inputs an image, the intelligent vision algorithm in the visual servo module 3024 identifies targets in the video, and an image topic is then published together with a video stream on which the identification results are superimposed. It should be appreciated that the visual servo module 3024 uses intelligent vision algorithms to realize multi-target detection and tracking, controlling the electro-optical pod 40 to lock onto a target after the ground station selects it and allowing the unmanned aerial vehicle to track that target.
As shown in fig. 4, during flight the unmanned aerial vehicle node publishes the flight-control information topic; its content is the local flight telemetry information read periodically from the flight controller, packed into telemetry data packets according to a defined communication protocol. The flight-control information topics are subscribed to by the ground station node. After the ground station software receives the telemetry, it unpacks it, obtains information such as aircraft speed, longitude, latitude, altitude, acceleration and battery level, and determines which unmanned aerial vehicle the packet came from according to the unmanned aerial vehicle ID contained in the packet. The ground station displays the aircraft's information in the corresponding display column. At the same time, the longitude and latitude of the unmanned aerial vehicle are converted into two-dimensional coordinates, and an unmanned aerial vehicle icon is displayed at the corresponding position on the offline map, with a different icon color for each unmanned aerial vehicle. In this way the ground station monitors the state information of the unmanned aerial vehicles, and the real-time position of each unmanned aerial vehicle can be observed on the map.
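A minimal C++ sketch of converting the telemetered longitude and latitude into two-dimensional map coordinates is given below; a Web Mercator tile projection with 256-pixel tiles is assumed here because it matches common offline tile maps, although the projection actually used by the ground station is not specified.

#include <cmath>
#include <utility>

// Convert latitude/longitude (degrees) into global pixel coordinates of a
// Web Mercator tile map at the given zoom level.
std::pair<double, double> latLonToMapPixel(double lat_deg, double lon_deg, int zoom)
{
  const double pi = 3.14159265358979323846;
  const double world = 256.0 * std::pow(2.0, zoom);  // map size in pixels
  const double lat = lat_deg * pi / 180.0;
  const double x = (lon_deg + 180.0) / 360.0 * world;
  const double y =
      (1.0 - std::log(std::tan(lat) + 1.0 / std::cos(lat)) / pi) / 2.0 * world;
  return std::make_pair(x, y);  // position at which the UAV icon is drawn
}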
Further, as shown in fig. 2, the information processing board 30 includes:
a cooperative control module 301, the cooperative control module 301 being configured to: activate the swarm controller to perform the initial flight-path calculation according to the task-input topic.
Specifically, as shown in fig. 3, taking three unmanned aerial vehicles as an example, the ground station software creates a ground station node that publishes the task-input topic. After the commander enters information such as the number of aircraft, the mission area, the cruising altitude and the no-fly zone at the ground station, this information is assembled into a data packet and loaded onto the task-input topic. Every unmanned aerial vehicle executing the task creates an unmanned aerial vehicle node, which subscribes to the task-input topic from the ground and then sends the received data packet to the port connecting the information processing board with the swarm controller, from which the swarm controller obtains the task input. This transmission mechanism allows the task planning information published by the ground station to be received by all unmanned aerial vehicles simultaneously, and it also enables the swarm controller to start the initial task planning. After receiving the task planning information, the swarm controller performs distributed task planning and calculates the initial flight path of each aircraft. At the same time, each unmanned aerial vehicle node publishes a swarm-control topic. The content of the swarm-control topic comes from the port connecting the information processing board with the swarm controller and contains the task planning information generated by the swarm controller; it also contains the local state information read by the information processing board from the port connected to the flight controller. The unmanned aerial vehicle node of each unmanned aerial vehicle subscribes to the swarm-control topics published by all other unmanned aerial vehicles in the cluster, and the topic content is passed to the swarm controller through a serial port. This transmission mechanism forms an inter-aircraft state-sharing network, and the swarm controller can perform online distributed computation according to the real-time states of all aircraft in the cluster. The task planning results enable the unmanned aerial vehicle cluster to perform flight actions such as formation keeping, formation transformation, intelligent obstacle avoidance and distributed flight according to the task requirements; corresponding control instructions or aircraft routes are generated and forwarded through the information processing board to the local flight controller, which controls the aircraft to change its attitude, direction, speed, altitude or flight path.
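A minimal C++ (roscpp) sketch of this inter-aircraft sharing mechanism for a three-UAV cluster is given below. The topic names, the std_msgs/UInt8MultiArray message type and the writeToSwarmController() helper are assumptions made for the example; the serial read that fills the locally published packet is omitted.

#include <ros/ros.h>
#include <std_msgs/UInt8MultiArray.h>
#include <cstdint>
#include <vector>

// Hypothetical helper: write a frame to the serial port of the swarm controller.
void writeToSwarmController(const std::vector<uint8_t>& frame)
{
  (void)frame;  // serial I/O omitted in this sketch
}

// Relay a peer UAV's swarm-control packet to the local swarm controller so that
// it can run the distributed planning step with cluster-wide state.
void onPeerSwarmMsg(const std_msgs::UInt8MultiArray::ConstPtr& msg)
{
  writeToSwarmController(msg->data);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "uav1_node");
  ros::NodeHandle nh;

  // Publish this UAV's own swarm-control topic (planning output plus local state).
  ros::Publisher my_pub =
      nh.advertise<std_msgs::UInt8MultiArray>("/uav1/swarm_ctrl", 10);

  // Subscribe to the swarm-control topics of every other UAV in the cluster.
  ros::Subscriber s2 = nh.subscribe("/uav2/swarm_ctrl", 10, onPeerSwarmMsg);
  ros::Subscriber s3 = nh.subscribe("/uav3/swarm_ctrl", 10, onPeerSwarmMsg);

  ros::spin();
  return 0;
}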
Further, the information processing board 30 also includes an on-board perception module 303, configured to: identify and track targets in the video frames based on an artificial intelligence recognition model and a tracking algorithm, superimpose an identification box in the video frame to envelop each recognized target, assign the same target ID to the same recognized target, encode the video frames with the superimposed identification boxes and assigned target IDs, and push them to the ground station.
Further, the information processing board 30 also includes a human-machine interaction module 304, which is responsible for air-ground data communication and is connected through the data-link network port to the ground station software and to the interaction modules of the other unmanned aerial vehicles in the cluster. It publishes received ground station instructions to the corresponding ports so that other devices or processes can use them to carry out the corresponding functions, and it also receives feedback from other processes and periodically sends it to the ground station. The information processor identifies targets in the continuous video frames captured by the electro-optical pod, marks them and sends them to the ground station; the operator locks a target by clicking it on the ground station, the target is tracked, and the locked position is sent to the ground station; or the operator frame-selects the identified target on the ground station, the frame-selected target is tracked, and the locked position is sent to the ground station.
Specifically, as shown in fig. 5, taking a single aircraft as an example: at the airborne end, the information processing board creates an image publishing node that publishes two image topics, a high-resolution image and a compressed image. After the image captured by the electro-optical pod is acquired by the image acquisition card, it is read by the image publishing node and two video topics of different resolution are published. The high-resolution image is collected and stored by a storage device as important flight-record data. The compressed image enters the target recognition node. The target recognition node runs an artificial-intelligence-based target recognition and tracking algorithm, recognizes and tracks targets appearing in the compressed video, such as different vehicles, personnel and bicycles, superimposes identification boxes in the video to envelop the recognized targets, and assigns each recognized target an ID number that is unique within the same picture. After identification, the target recognition node encodes the video image with the superimposed identification boxes and ID numbers, downlinks it to the ground station as an RTSP (real-time streaming protocol) stream over the data link, and at the same time publishes a target-information topic. The target recognition node also periodically receives the state information of the electro-optical pod and publishes a servo-control topic, whose content includes the pod state information and the IDs of the recognized targets. On the ground, a servo-control node is created, which also publishes to the servo-control topic. Control of the electro-optical pod servo system, such as unlocking, electric lock, servo slewing, servo zeroing and servo state changes, is performed through different instructions: after a function button is clicked on the ground station interface, the servo-control node sends the corresponding instruction to the servo-control topic.
As shown in fig. 5, the ground station software has an RTSP stream decoding function. Clicking the number of an unmanned aerial vehicle on the ground station interface reads the RTSP stream sent from that vehicle's IP address, decodes it and plays it in real time. The played video is the real-time picture captured by the pod, with the target identification boxes and their ID numbers superimposed; the ID numbers appearing in the video are also shown below the video playing interface, where they can be clicked and selected. While monitoring these videos, the operator can lock targets appearing in the picture. There are two locking modes: selection locking and frame-selection locking. Selection locking locks a target by clicking its ID number button in the interface; after the button is clicked, the servo-control node sends the ID number to the servo-control topic. Frame-selection locking locks the target inside a rectangular area drawn in the video picture; after the frame selection is completed, the servo-control node calculates the centre coordinates of the selected rectangle, i.e. the relative coordinates within the image, and sends them to the servo-control topic. The target recognition node subscribes to the servo-control topic. If the received topic content is a servo-control instruction, the instruction is sent to the serial port connected to the electro-optical pod, which receives it and performs the corresponding servo action. If the received topic content is a target ID number, a target-locking instruction is generated, and the centre position of the target corresponding to that ID in the picture is sent to the serial port connected to the electro-optical pod. If the received topic content is the centre coordinates of a rectangle, the coordinates are converted into the corresponding position in the electro-optical pod's picture, a locking instruction is generated, and the converted coordinates are sent to the serial port connected to the electro-optical pod. After the electro-optical pod receives the coordinate values, it locks onto the target at those coordinates, and the visual servo module 3024 is controlled to keep the target locked. At the same time, the flight controller and the visual servo module 3024 jointly correct the motion of the unmanned aerial vehicle and the rotation of the pod to track and keep the target locked.
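The following C++ sketch illustrates, under assumed data structures, how the target recognition node might dispatch the three kinds of servo-control topic content described above (a servo command, a clicked target ID, or the centre of a frame-selected rectangle). The ServoMsg layout, the coordinate convention and the send helpers are illustrative and are not disclosed here.

#include <cstdint>

enum class ServoMsgType : uint8_t { kCommand, kTargetId, kRectCentre };

struct ServoMsg {
  ServoMsgType type;
  uint8_t command;   // e.g. unlock, electric lock, slew, zero
  int target_id;     // ID of a recognized target to lock
  float cx, cy;      // frame-selection centre in relative image coordinates [0, 1]
};

// Stubs standing in for the serial writes to the electro-optical pod.
void sendPodCommand(uint8_t command) { (void)command; }
void sendLockOnTarget(int target_id) { (void)target_id; }
void sendLockOnPixel(float px, float py) { (void)px; (void)py; }

void handleServoTopic(const ServoMsg& m, int pod_width, int pod_height)
{
  switch (m.type) {
    case ServoMsgType::kCommand:
      sendPodCommand(m.command);      // forward the servo instruction unchanged
      break;
    case ServoMsgType::kTargetId:
      sendLockOnTarget(m.target_id);  // lock on the target with the clicked ID
      break;
    case ServoMsgType::kRectCentre:
      // Convert the relative rectangle centre into the pod camera's own pixel
      // coordinates before issuing the locking instruction.
      sendLockOnPixel(m.cx * pod_width, m.cy * pod_height);
      break;
  }
}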
As another possible implementation, the complete flow of starting the ground station, connecting the unmanned aerial vehicles, monitoring the reconnaissance picture and locking a target is as follows:
Step 1: the unmanned aerial vehicles and corresponding devices are prepared as shown in fig. 1, and the data format and communication protocol between devices are defined. In this implementation example, four unmanned aerial vehicles are used; the on-board information processing boards and the ground station computer run the Ubuntu 18.04 operating system with ROS Melodic 1.14.10, and the ground station software is developed based on Qt 5.9.9.
Step 2: task initialization. The ROS master node IP address, the local IP address and the number of unmanned aerial vehicles performing the task are filled in at the corresponding positions of the interface shown in fig. 6. After clicking connect, the aircraft data corresponding to that number of unmanned aerial vehicles are subscribed to.
Step 3: unmanned aerial vehicle state display, as shown in fig. 7. The interface shows an online map; four panels on the left of the map display the telemetry of four unmanned aerial vehicles at a time, and if there are more than four aircraft, the drop-down box on each panel switches between unmanned aerial vehicles. The map in the middle can be zoomed in, zoomed out and dragged, and on start-up it is positioned according to the computer's current IP address.
Step 4: task area setting. Clicking the task-area setting button in fig. 7 opens the task-area setting interface shown in fig. 8. As the mouse moves, the longitude and latitude are displayed in real time in the lower left corner. After the button in the target-area panel on the right is clicked and a point on the map is clicked, the longitude and latitude of corner 1 (the clicked point) are shown below the button; as the mouse moves over the map the length and width of the area change accordingly, and after a second click the longitude and latitude of corner 2 are shown and the area is fixed. After the no-fly-zone button is clicked, clicking on the map creates a red circle whose radius can be changed by dragging the small white square on its boundary; the radius shown below the button can be checked while dragging. Clicking redeploy clears the no-fly zones and target areas on the map. Clicking confirm closes the task decision interface, and clicking the swarm controller take-over button in the interface of fig. 7 uploads the task decision data.
Step 5: zero-point setting. After zero point is clicked in the interface of fig. 8, the relative zero-point position of the flight task can be set, as shown in fig. 9. Moving the mouse over the map changes the longitude and latitude shown in the zero-point setting interface to the values under the cursor; clicking a point on the map locks the coordinates of that point, and the coordinate values locked when confirm is clicked are taken as the zero point.
Step 6: target detection and locking. Clicking the pod picture interface button in fig. 6 opens the interface shown in fig. 10. The IP setting button in the lower right corner sets the IP address of the unmanned aerial vehicle. The seeker picture pushed over RTSP is displayed in the red frame, and the pictures of the other unmanned aerial vehicles can be switched with the picture-switching button in the upper right corner. Playback is started and paused with the play and pause icons below. The pod servo is controlled manually with the servo-control buttons on the right, and servo unlocking and locking are controlled with the electric-unlock and electric-lock buttons in the lower right corner. The ID numbers above the identified targets in the picture are also shown in the target ID field below. The pod status field in the lower right displays the pod and servo information. After manual frame selection is clicked, a frame can be drawn in the picture; when the frame selection is finished, the position information of the selected target is sent to the information processor to lock and track the target. The target recognition result is shown in fig. 11.
Compared with the prior art, the distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system provided by the embodiments of the invention has the following beneficial effects:
(1) Being developed on ROS, it forms an air-ground integrated whole; part of the software system's functions can be migrated to the air, reducing the computational load on the ground station computer.
(2) The software and hardware architecture based on ROS and the information processing board makes the system applicable to any unmanned platform that can carry an information processing board; unmanned platforms of different configurations fitted with the board can be controlled cooperatively in a heterogeneous manner, and multiple unmanned aerial vehicles can be controlled to carry out cluster reconnaissance tasks.
(3) No parameter setting of individual unmanned aerial vehicles is required; a global task deployment mode is adopted, and task decomposition is completed autonomously by the unmanned aerial vehicles, which simplifies the deployment process.
(4) A distributed task planning mode is adopted; during task execution the ground station can change the task, and the unmanned aerial vehicles respond quickly to execute the new task.
(5) The unmanned aerial vehicles have autonomous real-time mission planning capability and can correct their flight paths according to actual conditions during flight. When a reconnaissance task is executed, targets can be allocated reasonably so that different unmanned aerial vehicles track and lock different targets.
(6) Switchable return of reconnaissance video from multiple unmanned aerial vehicles is provided, so that the operator can view the returned pictures of all unmanned aerial vehicles while controlling them; the situation in the task area can be monitored globally while data-link bandwidth is saved.
(7) The human-machine interaction process is rich: reconnaissance targets can be selected and locked through the ground station on the basis of the artificial intelligence algorithm, with a human in the loop confirming a strike. In addition, a specific target can be frame-selected even when it has not been recognized by the artificial intelligence, which exploits the convenience and generality of fast automatic recognition while improving recognition accuracy with a human in the loop.
The embodiments of the invention provide a distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control method, comprising the following steps:
Step S1: the swarm controller calculates the local flight path and acceleration information according to the state information of the other unmanned aerial vehicles in the cluster and sends the local flight path and acceleration information to the information processing board;
Step S2: the information processing board receives video frames from the electro-optical pod and other optical payloads as well as detection data from detection-type payloads, identifies targets in the video frames using an artificial intelligence algorithm and sends the video frames to the ground station; it also receives the local flight path and acceleration information calculated by the swarm controller and sends them to the flight controller, so that the unmanned aerial vehicle autonomously cooperates with the other unmanned aerial vehicles in the cluster to complete the flight task.
Compared with the prior art, the beneficial effects of the distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control method provided by the embodiments of the invention are the same as those of the distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system described in the above technical solution, and are not repeated here.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art can easily think about variations or alternatives within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. A distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system,
characterized by comprising the following steps: a plurality of unmanned aerial vehicles and a ground station in communication connection with the plurality of unmanned aerial vehicles, wherein each unmanned aerial vehicle of the plurality of unmanned aerial vehicles comprises:
a swarm controller, configured to calculate the local flight path and acceleration information according to the state information of the other unmanned aerial vehicles in the cluster and to send the local flight path and acceleration information to the information processing board;
a flight controller, configured to control the attitude of the unmanned aerial vehicle and to adjust the speed and flight direction of the unmanned aerial vehicle;
an electro-optical pod, configured to capture video frames in real time and to track targets;
an information processing board, configured to receive video frames from the electro-optical pod and other optical payloads as well as detection data from detection-type payloads, to identify targets in the video frames using an artificial intelligence algorithm and to send the video frames to the ground station; and to receive the local flight path and acceleration information calculated by the swarm controller and send them to the flight controller, so that the local unmanned aerial vehicle autonomously cooperates with the cluster to complete the flight task.
2. The distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system of claim 1, wherein the information processing board comprises:
a data access module, configured to:
receive telemetry data from the flight controller as a flight-control information topic and send the flight-control information topic to the ground station node, so that the unmanned aerial vehicle state information can be monitored through the ground station;
create an unmanned aerial vehicle node and, based on the unmanned aerial vehicle node, subscribe to the task-input topic published by the ground station node, so that all unmanned aerial vehicles in the cluster receive the task-input topic;
establish a swarm-control topic based on the unmanned aerial vehicle node, subscribe to the swarm-control topics of the other unmanned aerial vehicles in the cluster, and send the swarm-control topics of the other unmanned aerial vehicles to the swarm controller, the swarm controller calculating the local flight path and acceleration information according to the swarm-control topics of the other unmanned aerial vehicles so as to control the local unmanned aerial vehicle to autonomously cooperate with the other unmanned aerial vehicles in the cluster to complete the flight task.
3. The distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system of claim 2, wherein the information processing board comprises:
a cooperative control module, configured to: activate the swarm controller to perform the initial flight-path calculation according to the task-input topic.
4. The distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system of claim 2, wherein the information processing board comprises:
an on-board perception module, configured to:
identify and track targets in the video frames based on an artificial intelligence recognition model and a tracking algorithm, superimpose an identification box in the video frame to envelop each recognized target, assign the same target ID to the same recognized target, encode the video frames with the superimposed identification boxes and assigned target IDs, and push them to the ground station.
5. The distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system of claim 2, wherein the information processing board comprises:
a human-machine interaction module, configured so that:
the information processor identifies targets in the continuous video frames captured by the electro-optical pod, marks them and sends them to the ground station; an operator locks a target by clicking it on the ground station, the target is tracked, and the locked position is sent to the ground station; or the operator frame-selects an identified target on the ground station, the frame-selected target is tracked, and the locked position is sent to the ground station.
6. The distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system of claim 1, wherein,
the unmanned aerial vehicles comprise unmanned aerial vehicles with various configurations, and a distributed control structure is formed among the unmanned aerial vehicles with the various configurations.
7. The distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system of claim 2, wherein the ground station is configured to:
receive the real-time positions and flight paths of the unmanned aerial vehicles, and display the real-time state information of all unmanned aerial vehicles;
enter a task-information input screen before the unmanned aerial vehicles fly and set task information therein, the task information including the number of unmanned aerial vehicles, the cruising area and the no-fly zone, the task information being sent through the information processing board and the data link to the swarm controller for task decision-making;
during reconnaissance, lock a target and control the electro-optical pod to track the target, or cancel the tracking and locking of the target by the electro-optical pod.
8. A distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control method, characterized by comprising the following steps:
step S1: the swarm controller calculates the local flight path and acceleration information according to the state information of the other unmanned aerial vehicles in the cluster and sends the local flight path and acceleration information to the information processing board;
step S2: the information processing board receives video frames from the electro-optical pod and other optical payloads as well as detection data from detection-type payloads, identifies targets in the video frames using an artificial intelligence algorithm and sends the video frames to the ground station; it also receives the local flight path and acceleration information calculated by the swarm controller and sends them to the flight controller, so that the local unmanned aerial vehicle autonomously cooperates with the cluster to complete the flight task.
CN202211477244.XA 2022-11-23 2022-11-23 Distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system and method Pending CN116068928A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211477244.XA CN116068928A (en) 2022-11-23 2022-11-23 Distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system and method

Publications (1)

Publication Number Publication Date
CN116068928A true CN116068928A (en) 2023-05-05

Family

ID=86170667

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211477244.XA Pending CN116068928A (en) 2022-11-23 2022-11-23 Distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system and method

Country Status (1)

Country Link
CN (1) CN116068928A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117075633A (en) * 2023-08-31 2023-11-17 杭州中汇通航航空科技有限公司 Unmanned aerial vehicle actual combat application management platform
CN117075633B (en) * 2023-08-31 2024-02-09 杭州中汇通航航空科技有限公司 Unmanned aerial vehicle actual combat application management platform

Similar Documents

Publication Publication Date Title
CN109613931B (en) Heterogeneous unmanned aerial vehicle cluster target tracking system and method based on biological social force
Ryan et al. An overview of emerging results in cooperative UAV control
EP3557358B1 (en) Adaptive autonomy system architecture
Ahrens et al. Vision-based guidance and control of a hovering vehicle in unknown, GPS-denied environments
CN108762286A (en) A kind of ground control system for the control that can fly to multiple UAVs
CN105573330A (en) Aircraft control method based on intelligent terminal
Valenti et al. Autonomous quadrotor flight using onboard RGB-D visual odometry
CN113778132B (en) Integrated parallel control platform for sea-air collaborative heterogeneous unmanned system
CN109557880A (en) A kind of ecological cruising inspection system based on unmanned plane
CN113271357B (en) Ground-air cooperative networking system and control method
Guinand et al. A decentralized interactive architecture for aerial and ground mobile robots cooperation
CN114115289A (en) Autonomous unmanned cluster reconnaissance system
CN116068928A (en) Distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system and method
Lin et al. Development of an unmanned coaxial rotorcraft for the DARPA UAVForge challenge
Skjervold Autonomous, cooperative uav operations using cots consumer drones and custom ground control station
Wheeler et al. Cooperative tracking of moving targets by a team of autonomous UAVs
CN114239305A (en) Battlefield situation scene simulation excitation system
Liang et al. Design and development of ground control system for tethered uav
Sanchez-Lopez et al. A vision based aerial robot solution for the mission 7 of the international aerial robotics competition
Tso et al. A multi-agent operator interface for unmanned aerial vehicles
CN113867393A (en) Flight path controllable unmanned aerial vehicle formation form reconstruction method
CN114897935B (en) Method and system for tracking aerial target object by unmanned aerial vehicle based on virtual camera
CN115237158A (en) Multi-rotor unmanned aerial vehicle autonomous tracking and landing control system and control method
CN114047786A (en) Cooperative processing system and method for distributed heterogeneous unmanned aerial vehicle cluster
Jones et al. Human-robot interaction for field operation of an autonomous helicopter

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination