CN114706427A - Sea-air stereoscopic collaborative searching system and control method thereof - Google Patents

Sea-air stereoscopic collaborative searching system and control method thereof

Info

Publication number
CN114706427A
CN114706427A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
module
target
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210619317.8A
Other languages
Chinese (zh)
Inventor
陈德山
黄俊杰
王冠宇
王琛鑫
张云飞
郑浩然
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University of Technology WUT
Original Assignee
Wuhan University of Technology WUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University of Technology WUT filed Critical Wuhan University of Technology WUT
Priority to CN202210619317.8A priority Critical patent/CN114706427A/en
Publication of CN114706427A publication Critical patent/CN114706427A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/104 Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30 Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a sea-air stereoscopic collaborative search system and a control method thereof. The system collects environmental information of a target area through the unmanned aerial vehicle module; the decision generation module then generates a rescue scheme according to the environmental information collected by the unmanned aerial vehicle module, and the control module controls the unmanned aerial vehicle module to carry out the rescue operation according to the rescue scheme. Through the input/output module, a user can view the collected environmental information in real time and can operate the control module. The system can precisely position a search-and-rescue target at long range and rapidly rescue it by means of the wing-in-ground-effect (WIG) craft. The invention can be widely applied in the technical field of unmanned aerial vehicles.

Description

Sea-air stereoscopic collaborative searching system and control method thereof
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to a sea-air stereoscopic collaborative searching system and a control method thereof.
Background
The unmanned aerial vehicle system is an intelligent motion platform that has been widely used in fields such as maritime search and rescue. Existing unmanned aerial vehicle systems in the related art have the following defects: on one hand, existing unmanned aerial vehicles cannot adapt to completely unfamiliar environments or emergencies, and because the operator controls the unmanned aerial vehicle wirelessly, subject to distance and environmental limits, the observation range of the system is limited; on the other hand, existing unmanned aerial vehicle systems possess only basic communication, control and decision modules and cannot construct a reconnaissance network tailored to a specific environment.
In view of the above, there is a need to solve the problems in the related art.
Disclosure of Invention
The present invention aims to solve at least to some extent one of the technical problems existing in the prior art.
In order to achieve the above object, the technical solution adopted by the embodiments of the present invention includes:
In one aspect, an embodiment of the present invention provides a sea-air stereoscopic collaborative search system, including:
the unmanned aerial vehicle module is used for acquiring environmental information of a target area; the unmanned aerial vehicle module comprises an unmanned wing-in-ground-effect (WIG) craft, an aerial drone and an underwater vehicle, wherein the WIG craft is used for carrying the aerial drone and the underwater vehicle and for carrying out rescue operations, the aerial drone is used for acquiring aerial environment information of the target area, and the underwater vehicle is used for acquiring underwater or water-surface environment information of the target area;
the decision generation module is used for generating a rescue scheme according to the environmental information acquired by the unmanned aerial vehicle module;
the control module is used for controlling the unmanned aerial vehicle module to carry out rescue operation according to the rescue scheme;
the input and output module is used for displaying the environmental information acquired by the unmanned aerial vehicle module in real time and acquiring control parameters; the control parameters are used for setting the working mode of the control module.
Further, the unmanned WIG craft comprises a radar, a visible light camera and an infrared camera; the aerial drone comprises a radar, a visible light camera and an infrared camera; the underwater vehicle comprises a radar, a visible light camera and an infrared camera;
the radar is used for acquiring a radar image of a target area;
the visible light camera is used for acquiring a binocular image of a target area;
the infrared camera is used for acquiring an infrared image of a target area.
Further, the system further comprises an environment sensing module, the environment sensing module comprising:
the target feature extraction unit is used for extracting target features of the radar image to obtain target feature information;
the weak and small target detection unit is used for detecting the weak and small target of the infrared image to obtain weak and small target information;
the target segmentation unit is used for carrying out target segmentation on the binocular image to obtain target segmentation information;
and the target positioning unit is used for carrying out information fusion and information understanding according to the target characteristic information, the small and weak target information and the target segmentation information to obtain a target position, a target size and a target number.
Further, the drone module also includes a navigation unit for controlling the unmanned WIG craft, the aerial drone, and the underwater vehicle to reach a target location.
Further, the aerial drone further comprises a search unit for:
dividing the target area into cells and calculating the probability that each cell contains the search target;
and generating a corresponding search scheme according to the probability.
Further, the decision generation module comprises:
the data acquisition unit is used for acquiring the environmental information of the target area acquired by the unmanned aerial vehicle module;
a feature extraction unit for extracting a feature vector of the environmental information;
and the vector comparison unit is used for matching the characteristic vectors through a decision database and determining the rescue scheme corresponding to the characteristic vectors.
Further, the input-output module includes:
the interface display unit is used for displaying the environmental information acquired by the unmanned aerial vehicle module;
the data integration unit is used for performing data integration on the acquired environmental information;
a parameter modification unit for obtaining the control parameter.
In another aspect, an embodiment of the present invention provides a control method of a sea-air stereoscopic collaborative search system, which is executed by the sea-air stereoscopic collaborative search system and includes the following steps:
measuring the current position of the unmanned aerial vehicle module in real time;
controlling, by the navigation unit, the drone module to reach the target area from the current location;
acquiring environmental information of the target area through the unmanned aerial vehicle module;
generating a corresponding rescue scheme through the decision generation module according to the environment information;
and controlling the unmanned aerial vehicle module to carry out rescue operation according to the rescue scheme.
Further, the step of measuring the current position of the drone module in real time specifically includes:
measuring, by a navigation unit, current positions of the unmanned WIG craft, the aerial drone, and the underwater vehicle;
calculating differential information through a mobile base station arranged on the unmanned WIG craft;
and updating the current positions of the aerial drone and the underwater vehicle according to the current position and the differential information.
Further, the step of measuring the current position of the drone module in real time further includes:
determining that the distance between the WIG craft and the drone is greater than a preset distance, and acquiring an ionospheric correction value of the area where the drone is located, wherein the drone comprises the aerial drone and the underwater vehicle;
calculating differential information according to the ionospheric correction value;
and updating the current position of the aerial drone according to the current position and the differential information.
The invention has the following beneficial effects:
the environmental information of target area is gathered through the unmanned aerial vehicle module to this embodiment, and the rethread decision-making generates the module and generates the rescue scheme according to the environmental information that the unmanned aerial vehicle module was gathered, and control module carries out the rescue operation according to rescue scheme control unmanned aerial vehicle module afterwards, and the user accessible input/output module looks over the environmental information that the unmanned aerial vehicle module was gathered in real time to can realize controlling control module through input/output module. According to the invention, on one hand, the WIG craft is taken as a mother craft, the signal base station, the unmanned aerial vehicle and the unmanned ship are carried to realize remote and accurate positioning search and rescue, the characteristics of high navigation speed and large carrying capacity of the WIG craft are utilized to realize rapid rescue of a fixed point target, on the other hand, the multisource image data complementation technology is applied, the unmanned equipment transmission signals are integrated and the synthesis processing is carried out under the same coordinate system, the target discrimination accuracy can be improved in a large range, the search efficiency and the recognition accuracy are effectively improved, and the manual intervention intensity is greatly reduced.
Drawings
To more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments are briefly introduced below. It should be understood that the drawings in the following description show only some embodiments of the technical solutions of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a sea-air stereoscopic collaborative search system provided in an embodiment of the present application;
fig. 2 is a block diagram of a sea-air stereoscopic collaborative search system according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating the steps of a control method of a sea-air stereoscopic collaborative search system according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating an operation of an environment sensing module according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a navigation unit according to an embodiment of the present invention;
fig. 6 is a schematic diagram of an iterative cell search according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to the present preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
In the description of the embodiments of the present invention, "several" means one or more and "a plurality of" means two or more; "greater than", "less than", "exceeding" and the like are understood to exclude the stated number, while "above", "below", "within" and the like are understood to include it. "At least one" means one or more, and "at least one of the following" and similar expressions refer to any combination of the listed items, including any combination of single items or plural items. If descriptions such as "first" and "second" appear, they are used only to distinguish technical features and are not to be understood as indicating or implying relative importance, the number of the indicated technical features, or the precedence of the indicated technical features.
It should be noted that terms such as "setting", "installing" and "connecting" in the embodiments of the present invention should be understood in a broad sense, and a person skilled in the art can reasonably determine their specific meanings in the embodiments by combining the specific content of the technical solutions. For example, the term "connected" may refer to a mechanical connection, an electrical connection, or communication between elements; the connection may be direct or indirect through an intermediary.
In the description of embodiments of the present disclosure, reference to the terms "one embodiment/implementation", "another embodiment/implementation", "certain embodiments/implementations", "in the above embodiments/implementations", etc., means that a particular feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or implementation of the present disclosure. In the present disclosure, schematic representations of these terms do not necessarily refer to the same embodiment or implementation. Furthermore, the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or implementations.
It should be noted that the technical features related to the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
According to statistics published by the China Maritime Search and Rescue Center, the success rate of maritime search and rescue in China is currently only about 95 percent, and some ships and persons in distress cannot be rescued effectively. In maritime rescue, quickly and accurately locating the target in distress on the water surface and underwater is a crucial step; it is the key to solving the "cannot see" problem in rescue and salvage, and is currently the primary factor restricting improvements in the efficiency of maritime search and rescue in China.
Therefore, the sea-air stereoscopic collaborative search system and the control method thereof are provided: environmental information of a target area is collected through the unmanned aerial vehicle module, a rescue scheme is then generated through the decision generation module according to the collected environmental information, and the control module subsequently controls the unmanned aerial vehicle module to carry out the rescue operation according to the rescue scheme; a user can view the collected environmental information in real time through the input/output module and can operate the control module through it. According to the invention, on one hand, the WIG craft is used as a mother ship carrying a signal base station, aerial drones and unmanned surface craft to achieve long-range, precisely positioned search and rescue, and its high speed and large payload capacity are then exploited to rescue a fixed-point target rapidly; on the other hand, multi-source image data complementation is applied, and the signals transmitted by the unmanned equipment are integrated and processed in a common coordinate system, so that target discrimination accuracy can be improved over a wide area, search efficiency and recognition accuracy are effectively improved, and the intensity of manual intervention is greatly reduced.
Fig. 1 is a schematic diagram of an implementation environment of a sea-air stereoscopic collaborative search system according to an embodiment of the present disclosure. Referring to fig. 1, the software and hardware body of the implementation environment mainly includes an operation terminal 101 and a server 102, the operation terminal 101 being communicatively connected to the server 102. The control method of the sea-air stereoscopic collaborative search system may be configured to be executed by the operation terminal 101 alone, by the server 102 alone, or through interaction between the operation terminal 101 and the server 102, as appropriate for the actual application; this embodiment is not specifically limited in this respect. In addition, the operation terminal 101 and the server 102 may be nodes in a blockchain, which is likewise not specifically limited in this embodiment.
Specifically, the operation terminal 101 in the present application may include, but is not limited to, any one or more of a smart watch, a smartphone, a computer, a personal digital assistant (PDA), an intelligent voice interaction device, a smart household appliance, or a vehicle-mounted terminal. The server 102 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a CDN (Content Delivery Network), and big data and artificial intelligence platforms. The operation terminal 101 and the server 102 may establish a communication connection through a wireless or wired network using standard communication technologies and/or protocols; the network may be the internet or any other network, including but not limited to a local area network (LAN), metropolitan area network (MAN), wide area network (WAN), mobile, wired or wireless network, private network, or any combination of virtual private networks.
Fig. 2 is a block diagram of a sea-air stereoscopic collaborative search system according to an embodiment of the present application; the control subject of the system may be at least one of an operation terminal or a server, and fig. 2 takes control of the system by the operation terminal as an example. Referring to fig. 2, the sea-air stereoscopic collaborative search system includes:
the unmanned aerial vehicle module is used for acquiring environmental information of a target area; the unmanned aerial vehicle module comprises an unmanned wing-in-ground-effect (WIG) craft, an aerial drone and an underwater vehicle, wherein the WIG craft is used for carrying the aerial drone and the underwater vehicle and for carrying out rescue operations, the aerial drone is used for acquiring aerial environment information of the target area, and the underwater vehicle is used for acquiring underwater or water-surface environment information of the target area;
the decision generation module is used for generating a rescue scheme according to the environmental information acquired by the unmanned aerial vehicle module;
the control module is used for controlling the unmanned aerial vehicle module to carry out rescue operation according to the rescue scheme;
the input and output module is used for displaying the environmental information acquired by the unmanned aerial vehicle module in real time and acquiring control parameters; the control parameters are used for setting the working mode of the control module.
In this embodiment, the unmanned aerial vehicle module collects environmental information of the target area, the decision generation module then generates a rescue scheme according to the collected environmental information, and the control module subsequently controls the unmanned aerial vehicle module to carry out the rescue operation according to the rescue scheme; through the input/output module, the user can view the collected environmental information in real time and can operate the control module. According to the invention, on one hand, the WIG craft serves as a mother ship carrying a signal base station, aerial drones and unmanned surface craft to achieve long-range, precisely positioned search and rescue, and its high speed and large payload capacity are exploited to rescue a fixed-point target rapidly; on the other hand, multi-source image data complementation is applied, integrating the signals transmitted by the unmanned equipment and processing them in a common coordinate system, which improves target discrimination accuracy over a wide area, effectively raises search efficiency and recognition accuracy, and greatly reduces the intensity of manual intervention.
As a further optional embodiment, the unmanned WIG craft includes a radar, a visible light camera and an infrared camera; the aerial drone includes a radar, a visible light camera and an infrared camera; the underwater vehicle includes a radar, a visible light camera and an infrared camera;
the radar is used for acquiring a radar image of a target area;
the visible light camera is used for acquiring a binocular image of a target area;
the infrared camera is used for acquiring an infrared image of a target area.
Specifically, the unmanned WIG craft, the aerial drone and the underwater vehicle can each be provided with one or more of a radar, a visible light camera and an infrared camera, through which the unmanned aerial vehicle module acquires information about the target area.
As a further optional implementation, the system further comprises an environment sensing module, the environment sensing module comprising:
the target feature extraction unit is used for extracting target features of the radar image to obtain target feature information;
the weak and small target detection unit is used for detecting the weak and small target of the infrared image to obtain weak and small target information;
the target segmentation unit is used for carrying out target segmentation on the binocular image to obtain target segmentation information;
and the target positioning unit is used for performing information fusion and information understanding according to the target feature information, the weak and small target information and the target segmentation information to obtain the target position, target size and target number.
Specifically, referring to fig. 4, the hardware system of the environment sensing module consists of a vision computer, an electro-optical pod, a radar and the like. The electro-optical pod includes a visible light camera and an infrared camera. The radar, visible light camera and infrared camera collect image information of the marine environment in real time; the different types of image data are fused to obtain information about obstacles ahead, and this information is shared with the control module through the communication system, as sketched below.
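As an illustration of the fusion step only (not the patented implementation), the following minimal Python sketch assumes each pipeline has already projected its candidate detections into a common metric coordinate frame as (x, y, confidence) tuples; the function names, the sample data and the 10 m association gate are all illustrative assumptions.

import math

def fuse_detections(radar, infrared, binocular, gate_m=10.0):
    # Greedy decision-level fusion: pool all detections, strongest first,
    # and merge any detection falling within the gate of an existing target.
    pool = [d for source in (radar, infrared, binocular) for d in source]
    pool.sort(key=lambda d: d[2], reverse=True)
    targets = []
    for x, y, c in pool:
        for t in targets:
            if math.hypot(x - t["x"], y - t["y"]) < gate_m:
                w = t["c"] + c  # confidence-weighted position update
                t["x"] = (t["x"] * t["c"] + x * c) / w
                t["y"] = (t["y"] * t["c"] + y * c) / w
                t["c"] = w
                break
        else:
            targets.append({"x": x, "y": y, "c": c})  # new fused target
    return targets  # target number = len(targets), positions = (x, y)

# One person seen by all three sensors, a second by radar alone.
radar = [(100.0, 200.0, 0.7), (400.0, 50.0, 0.6)]
infrared = [(103.0, 198.0, 0.9)]
binocular = [(101.0, 201.0, 0.8)]
print(fuse_detections(radar, infrared, binocular))

Target size could be fused the same way from per-sensor extent estimates; the sketch omits it for brevity.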
As a further optional embodiment, the unmanned aerial vehicle module further comprises a navigation unit for controlling the unmanned WIG craft, the aerial drone and the underwater vehicle to reach a target location.
Specifically, referring to fig. 5, a single-navigator configuration is adopted for navigation, with the WIG craft acting as the single navigator in this embodiment. The navigator-based collaborative navigation method using range information minimizes system complexity and the demands placed on measurement equipment, requires no pre-deployed positioning array, and is flexible in operating form and activity area, making it suitable for a large number of applications such as marine three-dimensional surveying and large-scale search.
Illustratively, the underwater vehicle can carry navigation equipment of different performance levels, as well as a dead-reckoning unit, a depth meter, a hydrophone, a Micro-Modem and other equipment, so that during rescue it can assist the main navigator (the WIG craft) in searching while also providing the main navigator with a precise position of the person in distress.
Likewise, the aerial drones carry equipment of varying capabilities. After the aerial drone formation is released, it begins searching from a base point and within a radius set by the program. The drones in the formation respectively carry a lidar, a camera, an infrared camera and the like; once each drone's equipment has completed its search, all information converges on the main navigator, i.e. the WIG craft, where the data are analyzed and the positions of the persons in distress are fixed from the air. The environmental information and individual condition of the persons in distress are then established from the ship radar on the WIG craft together with the combined information from the formation's various detection devices, and necessary and suitable rescue measures are taken after autonomous analysis.
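As a hedged illustration of navigator-based collaborative navigation from range information (a 2-D simplification with synthetic numbers, not the system's actual solver): successive broadcast positions of the moving WIG craft act as virtual anchors, and a follower that has measured its range to each of them can fix its own position by linearized least squares.

import numpy as np

def locate_from_ranges(anchors, ranges):
    # Linearize r_i^2 = |p - a_i|^2 by subtracting the first equation,
    # giving 2(a_i - a_0) . p = r_0^2 - r_i^2 + |a_i|^2 - |a_0|^2.
    a0, r0 = anchors[0], ranges[0]
    A = [2.0 * (ai - a0) for ai in anchors[1:]]
    b = [r0**2 - ri**2 + ai @ ai - a0 @ a0
         for ai, ri in zip(anchors[1:], ranges[1:])]
    p, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return p

# WIG craft positions at three broadcast epochs (metres).
anchors = [np.array([0.0, 0.0]), np.array([500.0, 0.0]), np.array([500.0, 400.0])]
truth = np.array([200.0, 150.0])                       # follower's true position
ranges = [np.linalg.norm(truth - a) for a in anchors]  # measured ranges
print(locate_from_ranges(anchors, ranges))             # recovers ~ [200. 150.]

In practice the follower also moves, so its dead-reckoned displacement between epochs is folded into the same least-squares problem; the stationary assumption keeps the sketch short.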
As a further optional implementation, the aerial drone further comprises a search unit for:
dividing the target area into cells and calculating the probability that each cell contains the search target;
and generating a corresponding search scheme according to the probability.
Specifically, referring to fig. 6, the aerial drone selects the optimal search area using a "cell iterative search" method: the target area is first divided into cells and the probability that each cell contains the search target is calculated, giving the probability distribution map shown at a in fig. 6; a corresponding search scheme is then generated from the probability distribution map, shown as the search route at b in fig. 6. A minimal sketch of this idea follows.
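The sketch below illustrates the cell iterative search under stated assumptions (the grid size, the 0.8 detection probability and the Bayesian non-detection update are illustrative, not the disclosed algorithm): the drone repeatedly visits the most probable cell, and after each fruitless look that cell's probability mass is reduced and the map renormalized.

import numpy as np

def plan_search(prob, steps=5, p_detect=0.8):
    # Greedy probabilistic cell search over a prior probability map.
    prob = prob / prob.sum()
    route = []
    for _ in range(steps):
        cell = np.unravel_index(np.argmax(prob), prob.shape)
        route.append(cell)
        prob[cell] *= 1.0 - p_detect  # Bayes update after a miss
        prob /= prob.sum()            # renormalize remaining belief
    return route

# A 4x4 probability map with mass concentrated toward one corner,
# standing in for the computed distribution at a in fig. 6.
rng = np.random.default_rng(0)
prior = rng.random((4, 4)) + np.outer(np.linspace(1, 2, 4), np.linspace(1, 2, 4))
print(plan_search(prior))  # cell visiting order, i.e. the search scheme

Connecting the visited cells in order yields a route like the one at b in fig. 6; a real planner would additionally penalize travel distance between cells.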
As a further optional implementation, the decision generating module includes:
the data acquisition unit is used for acquiring the environmental information of the target area acquired by the unmanned aerial vehicle module;
a feature extraction unit for extracting a feature vector of the environmental information;
and the vector comparison unit is used for matching the characteristic vectors through a decision database and determining the rescue scheme corresponding to the characteristic vectors.
Specifically, the decision generation module acquires the environmental information of the target area collected by the unmanned aerial vehicle module through the data acquisition unit, extracts a feature vector from the environmental information through the feature extraction unit, and the vector comparison unit then matches the feature vector against a decision database to determine the rescue scheme corresponding to the feature vector.
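A minimal sketch of the vector comparison step, assuming the decision database is simply a list of previously labelled feature vectors and using cosine similarity for the match (both assumptions; the stored vectors and scheme names are invented for illustration):

import numpy as np

def match_scheme(feature, database):
    # Return the rescue scheme whose stored feature vector is most
    # similar to the query feature under cosine similarity.
    def cos(u, v):
        return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return max(database, key=lambda entry: cos(feature, entry[0]))[1]

# Hypothetical decision database of (feature vector, rescue scheme) pairs.
database = [
    (np.array([0.9, 0.1, 0.2]), "deploy aerial drone formation"),
    (np.array([0.1, 0.8, 0.3]), "dispatch underwater vehicle"),
    (np.array([0.2, 0.3, 0.9]), "WIG craft direct approach"),
]
print(match_scheme(np.array([0.85, 0.2, 0.15]), database))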
The data acquisition unit, the feature extraction unit and the vector comparison unit can be realized by a machine learning model. Illustratively, in the embodiment of the present application, rescue schemes under different conditions and environments are used as the training data set; once obtained, the training data set can be input into the initialized decision generation model for training. Specifically, after the data in the training data set are input into the initialized model, the recognition result output by the model, i.e. the decision prediction result, is obtained, and the accuracy of the model's predictions can be evaluated from it so that the model parameters are updated.

For the decision generation model, the accuracy of a prediction can be measured by a loss function (Loss Function), which is defined on a single training example and measures its prediction error; specifically, the loss value of a training example is determined by its label and the model's prediction for it. In actual training the data set contains many examples, so a cost function (Cost Function) is generally adopted to measure the overall error: it is defined on the whole training data set and computes the average prediction error over all examples, which better measures the model's predictive performance. For a general machine learning model, the cost function plus a regularization term measuring model complexity can serve as the training objective function, from which the loss value of the whole training data set is obtained.

There are many common loss functions, such as the 0-1 loss, squared loss, absolute loss, logarithmic loss and cross-entropy loss, any of which can serve as the loss function of the machine learning model; they are not described one by one here. In the embodiment of the present application, one of these loss functions can be selected to determine the training loss value. Based on that loss value, the model parameters are updated by the back-propagation algorithm, and after several rounds of iteration the trained decision generation model is obtained. Specifically, the number of iteration rounds may be preset, or training may be considered complete when the test set meets the accuracy requirement.
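The training procedure described above can be pictured with a minimal supervised loop; the network shape, the cross-entropy loss choice, the Adam optimizer and the synthetic data are all assumptions for illustration, not disclosed details of the decision generation model.

import torch
import torch.nn as nn

# Toy decision generation model: environment feature vector -> scheme class.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
loss_fn = nn.CrossEntropyLoss()  # one of the loss functions listed above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

features = torch.randn(128, 16)       # synthetic training data set
labels = torch.randint(0, 4, (128,))  # synthetic rescue-scheme labels

for epoch in range(10):  # preset number of iteration rounds
    optimizer.zero_grad()
    loss = loss_fn(model(features), labels)  # cost over the training set
    loss.backward()   # back propagation
    optimizer.step()  # parameter update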
As a further optional implementation, the input/output module includes:
the interface display unit is used for displaying the environmental information acquired by the unmanned aerial vehicle module;
the data integration unit is used for performing data integration on the acquired environmental information;
a parameter modification unit for obtaining the control parameter.
Specifically, the input/output module is mainly used for: 1) interface display, mainly including display of state information of the unmanned surface craft such as position, speed and heading, and zooming, translation and rotation of the observation area; 2) control-related parameter functions, mainly including control-authority switching, automatic/manual switching, control parameter adjustment, navigation instruction issuing and equipment switching instructions; 3) task-control-related instruction functions, mainly including two-dimensional map generation and reading, navigation area instruction issuing, navigation reference track issuing and the like; 4) post-processing functions for data, including navigation data playback, target data playback and navigation/target data synchronization.
The above is a description of the system configuration of the embodiment of the present invention, and a control method of the embodiment of the present invention is described below.
Referring to fig. 3, an embodiment of the present invention provides a control method for a sea-air stereoscopic collaborative search system, which is executed by the sea-air stereoscopic collaborative search system, and includes the following steps:
s101, measuring the current position of the unmanned aerial vehicle module in real time;
s102, controlling the unmanned aerial vehicle module to reach the target area from the current position through the navigation unit;
s103, acquiring environment information of the target area through the unmanned aerial vehicle module;
s104, generating a corresponding rescue scheme through the decision generation module according to the environment information;
s105, controlling the unmanned aerial vehicle module to carry out rescue operation according to the rescue scheme.
Specifically, the current position of the unmanned aerial vehicle module is measured while the navigation unit guides the module to the target area, so that the unmanned aerial vehicle module can acquire the environmental information of the target area; the decision generation module then generates the corresponding rescue scheme, according to which the unmanned aerial vehicle module is controlled to carry out the rescue. On one hand, the WIG craft serves as a mother ship carrying a signal base station, aerial drones and unmanned surface craft to achieve long-range, precisely positioned search and rescue, and its high speed and large payload capacity are then used to rescue a fixed-point target rapidly; on the other hand, multi-source image data complementation is applied, the signals transmitted by the unmanned equipment are integrated and processed in a common coordinate system, and target discrimination accuracy can thus be improved over a wide area, effectively improving search efficiency and recognition accuracy and greatly reducing the intensity of manual intervention.
As a further optional implementation manner, the step of measuring the current position of the drone module in real time specifically includes:
measuring, by a navigation unit, current positions of the unmanned WIG craft, the aerial drone, and the underwater vehicle;
calculating differential information through a mobile base station arranged on the unmanned WIG craft;
and updating the current positions of the aerial drone and the underwater vehicle according to the current position and the differential information.
As a further optional implementation manner, the step of measuring the current position of the drone module in real time further includes:
determining that the distance between the WIG craft and the drone is greater than a preset distance, and acquiring an ionospheric correction value of the area where the drone is located, wherein the drone comprises the aerial drone and the underwater vehicle;
calculating differential information according to the ionospheric correction value;
and updating the current position of the aerial drone according to the current position and the differential information.
Specifically, the cloud server calculates the absolute positions of the drones and the WIG craft and can broadcast them to the drones and the WIG craft in time, so that the drones and the WIG craft can calculate relative positions and achieve collaborative navigation. Further, the cloud server can calculate the ionospheric delay correction value for the drones' region and send it to the WIG craft; using this correction, the WIG craft can quickly compute long-baseline RTK differential information and broadcast the result to the drones, improving cooperative positioning accuracy. The cooperative positioning accuracy between the drones and the WIG craft can thus be effectively improved, with better computational efficiency than traditional methods, providing technical support for applications in related fields. The method comprises the following steps (a simplified position-domain sketch follows the list):
calculating the absolute positions of the drones and the WIG craft from the satellite navigation observations, and uploading them to the cloud server;
calculating differential information with the mobile RTK base station on the WIG craft and broadcasting it to the cloud server through a radio station;
the cloud server updates the drones' positions according to the differential information and the prior absolute positions and broadcasts them to the drones;
each drone calculates and updates its relative position to the surrounding drones using the broadcast position information;
when the distance between the WIG craft and a drone exceeds the short-baseline RTK range, switching to the long-baseline working mode, in which the cloud server calculates the ionospheric correction value of the drone's region and broadcasts it to the WIG craft;
the WIG craft uses the ionospheric correction value, together with geometry-free combination observations, to quickly compute RTK differential information and broadcasts it to the cloud server through the radio station; the cloud server broadcasts the differential information to the drones, which use it to further calculate their relative positions to the surrounding drones;
when the cloud server's ionospheric correction value is not received, the WIG craft computes RTK differential information using ionosphere-free combination observations and sends it to the cloud server through the radio station; the cloud server broadcasts the differential information to the drones, which update their relative positions with it;
when no differential information is received, each drone extrapolates its position and relative position at the next moment from the existing position information, and updates them with the differential information once it is received.
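For intuition, steps 2-4 can be reduced to a position-domain sketch (a deliberate simplification: real RTK operates on carrier-phase observables, and the coordinates and error figures below are assumed): the position error observed at the WIG craft's mobile base station is treated as common to nearby drones and subtracted from their raw fixes.

import numpy as np

def differential_update(rover_estimate, base_estimate, base_truth):
    # The error seen at the base is assumed common to nearby rovers
    # (shared atmospheric and orbit errors) and is subtracted out.
    correction = base_truth - base_estimate  # the broadcast differential info
    return rover_estimate + correction

base_truth = np.array([0.0, 0.0, 0.0])        # surveyed base position (m)
common_error = np.array([1.2, -0.8, 2.1])     # shared error at this epoch
base_estimate = base_truth + common_error
drone_estimate = np.array([800.0, 300.0, 120.0]) + common_error  # raw fix
print(differential_update(drone_estimate, base_estimate, base_truth))
# -> [800. 300. 120.] once the common error cancels

The correction loses validity as the baseline grows, which is exactly why the method above switches to the long-baseline mode with ionospheric corrections.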
It can be understood that, compared with the prior art, the embodiment of the present invention also has the following advantages:
1) The WIG craft is used as a mother ship carrying a signal base station, aerial drones and unmanned surface craft to achieve long-range, precisely positioned search and rescue, and its high speed and large payload capacity are used to rescue a fixed-point target rapidly.
2) The invention offers functional innovation in quickly and accurately locating targets in distress on the water surface and underwater; the 3D sea-air integrated intelligent multi-layer search platform system it aims to develop can effectively extend the search and rescue range of the WIG craft and shorten the search and rescue time.
3) The invention is expected to apply multi-source image data complementation, integrating the signals transmitted by the unmanned equipment and processing them in a common coordinate system, which can improve underwater target discrimination accuracy over a wide area, thereby effectively improving search efficiency and recognition accuracy and greatly reducing the intensity of manual intervention.
While the preferred embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A sea-air stereoscopic collaborative search system, characterized by comprising:
the unmanned aerial vehicle module is used for acquiring environmental information of a target area; the unmanned aerial vehicle module comprises an unmanned wing-in-ground-effect (WIG) craft, an aerial drone and an underwater vehicle, wherein the WIG craft is used for carrying the aerial drone and the underwater vehicle and for carrying out rescue operations, the aerial drone is used for acquiring aerial environment information of the target area, and the underwater vehicle is used for acquiring underwater or water-surface environment information of the target area;
the decision generation module is used for generating a rescue scheme according to the environmental information acquired by the unmanned aerial vehicle module;
the control module is used for controlling the unmanned aerial vehicle module to carry out rescue operation according to the rescue scheme;
the input and output module is used for displaying the environmental information acquired by the unmanned aerial vehicle module in real time and acquiring control parameters; the control parameters are used for setting the working mode of the control module.
2. The sea-air stereoscopic collaborative search system according to claim 1, wherein the unmanned WIG craft comprises a radar, a visible light camera and an infrared camera; the aerial drone comprises a radar, a visible light camera and an infrared camera; the underwater vehicle comprises a radar, a visible light camera and an infrared camera;
the radar is used for acquiring a radar image of a target area;
the visible light camera is used for acquiring a binocular image of a target area;
the infrared camera is used for acquiring an infrared image of a target area.
3. The sea-air stereoscopic collaborative search system according to claim 2, further comprising an environment sensing module, the environment sensing module comprising:
the target feature extraction unit is used for extracting target features of the radar image to obtain target feature information;
the weak and small target detection unit is used for detecting the weak and small target of the infrared image to obtain weak and small target information;
the target segmentation unit is used for carrying out target segmentation on the binocular image to obtain target segmentation information;
and the target positioning unit is used for carrying out information fusion and information understanding according to the target characteristic information, the small and weak target information and the target segmentation information to obtain a target position, a target size and a target number.
4. The sea-air stereoscopic collaborative search system according to claim 1, wherein the unmanned aerial vehicle module further comprises a navigation unit for controlling the unmanned WIG craft, the aerial drone and the underwater vehicle to reach a target location.
5. The sea-air stereoscopic collaborative search system according to claim 1, wherein the aerial drone further comprises a search unit for:
dividing the target area into cells and calculating the probability that each cell contains the search target;
and generating a corresponding search scheme according to the probability.
6. The sea-air stereoscopic collaborative search system according to claim 1, wherein the decision generation module includes:
the data acquisition unit is used for acquiring the environmental information of the target area acquired by the unmanned aerial vehicle module;
a feature extraction unit for extracting a feature vector of the environmental information;
and the vector comparison unit is used for matching the characteristic vectors through a decision database and determining the rescue scheme corresponding to the characteristic vectors.
7. The sea-air stereoscopic collaborative search system according to claim 1, wherein the input/output module comprises:
the interface display unit is used for displaying the environmental information acquired by the unmanned aerial vehicle module;
the data integration unit is used for performing data integration on the acquired environmental information;
a parameter modification unit for obtaining the control parameter.
8. A control method of a sea-air stereoscopic collaborative search system, which is executed by the sea-air stereoscopic collaborative search system according to any one of claims 1 to 7, comprising the following steps:
measuring the current position of the unmanned aerial vehicle module in real time;
controlling the unmanned aerial vehicle module to reach a target area from a current position through a navigation unit;
acquiring environmental information of the target area through the unmanned aerial vehicle module;
generating a corresponding rescue scheme through the decision generation module according to the environment information;
and controlling the unmanned aerial vehicle module to carry out rescue operation according to the rescue scheme.
9. The control method according to claim 8, wherein the step of measuring the current position of the drone module in real time specifically comprises:
measuring, by a navigation unit, current positions of the unmanned WIG craft, the aerial drone, and the underwater vehicle;
calculating differential information through a mobile base station arranged on the unmanned WIG craft;
and updating the current positions of the aerial drone and the underwater vehicle according to the current position and the differential information.
10. The control method of claim 8, wherein the step of measuring the current position of the drone module in real time further comprises:
determining that the distance between the WIG craft and the drone is greater than a preset distance, and acquiring an ionospheric correction value of the area where the drone is located, wherein the drone comprises the aerial drone and the underwater vehicle;
calculating differential information according to the ionospheric correction value;
and updating the current position of the aerial drone according to the current position and the differential information.
CN202210619317.8A 2022-06-02 2022-06-02 Sea-air stereoscopic collaborative searching system and control method thereof Pending CN114706427A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210619317.8A CN114706427A (en) 2022-06-02 2022-06-02 Sea-air stereoscopic collaborative searching system and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210619317.8A CN114706427A (en) 2022-06-02 2022-06-02 Sea-air stereoscopic collaborative searching system and control method thereof

Publications (1)

Publication Number Publication Date
CN114706427A (en) 2022-07-05

Family

ID=82177722

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210619317.8A Pending CN114706427A (en) 2022-06-02 2022-06-02 Sea-air stereoscopic collaborative searching system and control method thereof

Country Status (1)

Country Link
CN (1) CN114706427A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117376934A (en) * 2023-12-08 2024-01-09 山东科技大学 Deep reinforcement learning-based multi-unmanned aerial vehicle offshore mobile base station deployment method

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108563242A (en) * 2018-03-30 2018-09-21 武汉理工大学 A kind of air-sea, which independently cooperates with, searches and rescues M3U platforms
CN108945343A (en) * 2018-05-30 2018-12-07 佛山市神风航空科技有限公司 A kind of rescue at sea system
CN208621968U (en) * 2018-06-22 2019-03-19 西安特种飞行器工程研究院有限公司 A kind of marine eco-environment cruising inspection system and underwater unmanned vehicle
CN109743096A (en) * 2018-12-24 2019-05-10 哈尔滨工程大学 A kind of UUV radio communication method based on unmanned plane
CN110058613A (en) * 2019-05-13 2019-07-26 大连海事大学 Multi-unmanned-aerial-vehicle multi-ant-colony collaborative target searching method
CN110389595A (en) * 2019-06-17 2019-10-29 中国工程物理研究院电子工程研究所 The unmanned plane cluster of double-attribute probability graph optimization cooperates with Target Searching Method
CN111176334A (en) * 2020-01-16 2020-05-19 浙江大学 Multi-unmanned aerial vehicle cooperative target searching method
CN111308523A (en) * 2020-03-31 2020-06-19 北京航空航天大学 Unmanned aerial vehicle unmanned ship collaborative navigation method
CN212935938U (en) * 2020-04-14 2021-04-09 上海溪莲海洋工程技术有限公司 Water area emergency rescue command system
CN111487986A (en) * 2020-05-15 2020-08-04 中国海洋大学 Underwater robot cooperative target searching method based on global information transfer mechanism
CN111703559A (en) * 2020-06-23 2020-09-25 哈尔滨工程大学 ROV detection system and detection method for searching underwater missing person
CN111731453A (en) * 2020-07-08 2020-10-02 海南大学 Rescue method and rescue system for life-saving unmanned ship based on carrying unmanned aerial vehicle
CN111830981A (en) * 2020-07-15 2020-10-27 武汉理工大学 Maritime rescue-oriented unmanned three-dimensional collaborative search and rescue platform
CN112124489A (en) * 2020-09-03 2020-12-25 武汉理工大学 Unmanned ground effect wing ship based on folding wings
CN113778132A (en) * 2021-09-26 2021-12-10 大连海事大学 Integrated parallel control platform for sea-air collaborative heterogeneous unmanned system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XIA ZHENGNONG ET AL. (EDS.): "Shanghai Lexicographical Publishing House", 31 July 2007 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117376934A (en) * 2023-12-08 2024-01-09 山东科技大学 Deep reinforcement learning-based multi-unmanned aerial vehicle offshore mobile base station deployment method
CN117376934B (en) * 2023-12-08 2024-02-27 山东科技大学 Deep reinforcement learning-based multi-unmanned aerial vehicle offshore mobile base station deployment method

Similar Documents

Publication Publication Date Title
US11373492B2 (en) Intelligent evacuation system and method used in subway station fire
US10580162B2 (en) Method for determining the pose of a camera and for recognizing an object of a real environment
CN105203084B (en) A kind of unmanned plane 3D panoramic vision devices
KR101793509B1 (en) Remote observation method and system by calculating automatic route of unmanned aerial vehicle for monitoring crops
KR100884100B1 (en) System and method for detecting vegetation canopy using airborne laser surveying
CN108139758A (en) Apparatus of transport positioning based on significant characteristics
CN112887899B (en) Positioning system and positioning method based on single base station soft position information
CN108061572B (en) Comprehensive situation display and control system and method for marine nuclear power platform
CN115421158B (en) Self-supervision learning solid-state laser radar three-dimensional semantic mapping method and device
Domozi et al. Real time object detection for aerial search and rescue missions for missing persons
CN112947587A (en) Intelligent unmanned ship search and rescue system and method
CN114706427A (en) Sea-air stereoscopic collaborative searching system and control method thereof
CN114923477A (en) Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology
Yapar et al. Locunet: Fast urban positioning using radio maps and deep learning
CN111652276B (en) All-weather portable multifunctional bionic positioning and attitude-determining viewing system and method
CN117685953A (en) UWB and vision fusion positioning method and system for multi-unmanned aerial vehicle co-positioning
CN112235041A (en) Real-time point cloud processing system and method and airborne data acquisition device and method
KR20160099336A (en) Mobile mapping system
Abdalla et al. Geospatial data integration
CN103777196A (en) Ground target distance single station measurement method based on geographic information and measurement system thereof
CN114266830B (en) Underground large space high-precision positioning method
CN112859907A (en) Rocket debris high-altitude detection method based on three-dimensional special effect simulation under condition of few samples
CN112379395A (en) Positioning navigation time service system
Wu et al. Derivation of Geometrically and Semantically Annotated UAV Datasets at Large Scales from 3D City Models
CN116817892B (en) Cloud integrated unmanned aerial vehicle route positioning method and system based on visual semantic map

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220705