CN116110255A - Ship berthing collision early warning method, system and storage medium - Google Patents


Info

Publication number
CN116110255A
CN116110255A (application number CN202310093766.8A)
Authority
CN
China
Prior art keywords
ship
data
collision
early warning
track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310093766.8A
Other languages
Chinese (zh)
Inventor
万程鹏
敖新培
曹高杰
范亮
张笛
李聚珍
明星月
徐紫东
袁居鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Inland River Port And Shipping Industry Research Co ltd
Wuhan University of Technology WUT
Original Assignee
Guangdong Inland River Port And Shipping Industry Research Co ltd
Wuhan University of Technology WUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Inland River Port And Shipping Industry Research Co ltd, Wuhan University of Technology WUT filed Critical Guangdong Inland River Port And Shipping Industry Research Co ltd
Priority to CN202310093766.8A
Publication of CN116110255A
Legal status: Pending

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 3/00 Traffic control systems for marine craft
    • G08G 3/02 Anti-collision systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 30/00 Adapting or protecting infrastructure or their operation
    • Y02A 30/30 Adapting or protecting infrastructure or their operation in transportation, e.g. on roads, waterways or railways

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Ocean & Marine Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Remote Sensing (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a ship berthing collision early warning method, system and storage medium, applied to the technical field of ship berthing, which can accurately predict the ship's direction of motion, realize early warning and monitoring of ship berthing, and reduce the occurrence of ship berthing collision accidents. The method comprises the following steps: acquiring automatic ship identification system data and preprocessing it to obtain ship preprocessing data; predicting a navigational speed predicted value and a heading predicted value of the ship through a preset neural network according to the ship preprocessing data; inputting the navigational speed predicted value and the heading predicted value into a ship trajectory dynamics equation model and solving to obtain ship track data; inputting the ship track data into a preset iteration block for iteration to obtain a ship coordinate prediction sequence; fusing the ship radar data and the video monitoring data to obtain video target image data; fusing the ship coordinate prediction sequence and the video target image data to obtain a ship predicted track image; and carrying out ship collision early warning according to the ship predicted track image to obtain collision early warning information.

Description

Ship berthing collision early warning method, system and storage medium
Technical Field
The invention relates to the technical field of ship berthing, in particular to a ship berthing collision early warning method, a system and a storage medium.
Background
With the continued growth of global maritime traffic and the trend toward larger and faster ships, traffic density in navigation channels keeps increasing and marine casualties occur frequently, seriously affecting the loading and unloading efficiency of wharves and causing economic losses. During berthing, the operator's observation is obstructed by the tall hull, the ship's superstructure or side can easily collide with shore-based loading and unloading equipment, and the enormous inertia of an extremely heavy hull delays the ship's response to steering. In the related art, because the operator cannot accurately grasp the ship's state and the surrounding spatial orientation, collisions between the ship and shore equipment occur easily, and ship berthing collision accidents are frequent.
Disclosure of Invention
In order to solve at least one of the above technical problems, the invention provides a ship berthing collision early warning method, system and storage medium, which can accurately predict the ship's motion, realize early warning and monitoring of ship berthing, and effectively reduce the occurrence of ship berthing collision accidents.
In one aspect, the embodiment of the invention provides a ship berthing collision early warning method, which comprises the following steps:
acquiring data of an automatic ship identification system;
preprocessing the data of the automatic ship identification system to obtain ship preprocessing data;
predicting a navigational speed predicted value and a heading predicted value of the ship through a preset neural network according to the ship preprocessing data;
inputting the navigational speed predicted value and the heading predicted value into a ship track dynamics equation model for solving to obtain ship track data;
inputting the ship track data into a preset iteration block for iteration to obtain a ship coordinate prediction sequence; the preset iteration block is constructed according to the ship track dynamics equation model and the double hidden layer radial basis function network model;
fusing the ship radar data and the video monitoring data to obtain video target image data;
fusing the ship coordinate prediction sequence and the video target image data to obtain a ship prediction track image;
and carrying out ship collision early warning according to the ship prediction track image to obtain collision early warning information.
The ship berthing collision early warning method provided by the embodiment of the invention has at least the following beneficial effects: in this embodiment, the automatic ship identification system data are first obtained and preprocessed to obtain the ship preprocessing data. Then, according to the ship preprocessing data, the navigational speed predicted value and the heading predicted value of the ship are predicted through a preset neural network. Next, this embodiment inputs the navigational speed predicted value and the heading predicted value into a ship trajectory dynamics equation model and solves it to obtain ship track data, constructs a preset iteration block from the ship trajectory dynamics equation model and the double hidden layer radial basis function network model, and obtains a ship coordinate prediction sequence by iterating the ship track data through the preset iteration block. By predicting the ship track and the ship coordinates in this way, the ship's direction of motion can be accurately anticipated, reaction time can be reserved for adjusting the ship, and the problem of collisions during ship berthing is alleviated. Further, this embodiment fuses the ship radar data with the video monitoring data to obtain video target image data, and fuses the ship coordinate prediction sequence with the video target image data to obtain a ship predicted track image, so that ship collision early warning is carried out according to the ship predicted track image and collision early warning information is obtained, early warning and monitoring of ship berthing are realized, and the occurrence of ship berthing collision accidents is effectively reduced. At the same time, because the ship radar data and the video monitoring data are fused, the position and direction of motion of the ship can be effectively identified, greatly reducing the probability of collision accidents during berthing.
According to some embodiments of the invention, the preprocessing the automatic ship identification system data includes:
screening according to the data of the automatic ship identification system to obtain abnormal data; wherein the abnormal data includes error data and missing data;
classifying the abnormal data to obtain a data classification result;
performing a rejection operation on the error data according to the data classification result;
and repairing the missing data according to the data classification result.
According to some embodiments of the invention, the predetermined neural network comprises a convolutional layer, a pooling layer, and a fully-connected layer;
the predicting, according to the ship preprocessing data, the navigational speed predicted value and the heading predicted value of the ship through a preset neural network includes:
inputting the ship preprocessing data into the convolution layer for first processing to obtain local area characteristic data;
inputting the local region characteristic data into the pooling layer for second processing to obtain characteristic mapping data;
inputting the feature mapping data into the full connection layer for third processing to obtain the navigational speed predicted value and the heading predicted value; wherein the third process includes concatenation, combination, and operation.
According to some embodiments of the present invention, the fusing of the radar data and the video surveillance data of the ship to obtain video target image data includes:
calculating to obtain ship space data through Euclidean distance algorithm and clustering algorithm according to the ship radar data;
identifying the video monitoring data through a preset target detection algorithm to obtain a ship target;
obtaining ship position data according to the ship target through the mapping relation between the video image space and the geographic space;
and fusing the ship space data and the ship position data to obtain the video target image data.
According to some embodiments of the present invention, the calculating the ship space data according to the ship radar data through the euclidean distance algorithm and the clustering algorithm includes:
constructing a point cloud cluster through a tree data structure according to the ship radar data;
dividing the point cloud clusters according to a Euclidean distance algorithm to obtain subset cloud clusters;
calculating the distance between each cloud point of the first cloud cluster in the subset cloud clusters and a preset cloud point to obtain point distance data; the preset cloud points comprise the cloud points in the first cloud cluster and the cloud points in a second cloud cluster adjacent to the first cloud cluster;
Performing cluster analysis on the cloud points in the subset cloud clusters according to the point distance data to obtain a clustering result;
estimating the geometric center of the clustering result to obtain geometric center data;
and calculating the ship space data of the clustering result through the geometric center data.
According to some embodiments of the invention, the fusing the ship space data and the ship position data to obtain the video target image data includes:
performing time synchronization operation on the ship space data and the ship position data to obtain time synchronization data;
performing coordinate system transformation operation according to the time synchronization data to obtain preset coordinate synchronization data; the preset coordinate synchronization data comprise ship space coordinate synchronization data and ship position coordinate synchronization data;
and carrying out association matching according to the ship space coordinate synchronous data and the ship position coordinate synchronous data to obtain the video target image data.
According to some embodiments of the present invention, the performing a ship collision pre-warning according to the ship predicted trajectory image to obtain collision pre-warning information includes:
Constructing a ship collision early warning model according to the collision time;
and inputting the ship predicted track image into the ship collision early warning model for prediction to obtain the collision early warning information.
On the other hand, the embodiment of the invention also provides a ship berthing collision early warning system, which comprises:
the acquisition module is used for acquiring the data of the automatic ship identification system;
the preprocessing module is used for preprocessing the automatic ship identification system data to obtain ship preprocessing data;
the prediction module is used for obtaining a navigational speed predicted value and a heading predicted value of the ship through a preset neural network according to the ship preprocessing data;
the calculation module is used for inputting the navigational speed predicted value and the heading predicted value into a ship track dynamics equation model to be solved, so as to obtain ship track data;
the iteration module is used for inputting the ship track data into a preset iteration block for iteration to obtain a ship coordinate prediction sequence; the preset iteration block is constructed according to the ship track dynamics equation model and the double hidden layer radial basis function network model;
the first fusion module is used for fusing the ship radar data and the video monitoring data to obtain video target image data;
The second fusion module is used for fusing the ship coordinate prediction sequence and the video target image data to obtain a ship prediction track image;
and the early warning module is used for carrying out ship collision early warning according to the ship prediction track image to obtain collision early warning information.
On the other hand, the embodiment of the invention also provides a ship berthing collision early warning system, which comprises:
at least one processor;
at least one memory for storing at least one program;
when the at least one program is executed by the at least one processor, the at least one processor is caused to implement the ship berthing collision warning method according to the above embodiment.
In another aspect, an embodiment of the present invention further provides a computer storage medium, in which a program executable by a processor is stored, where the program executable by the processor is used to implement the ship berthing collision early warning method according to the above embodiment.
Drawings
FIG. 1 is a flow chart of a ship berthing collision early warning method provided by an embodiment of the invention;
FIG. 2 is a schematic block diagram of a ship berthing collision early warning system provided by an embodiment of the invention;
fig. 3 is a schematic block diagram of a preset iteration block provided in an embodiment of the present invention.
Detailed Description
The embodiments described in the present application should not be construed as limiting the present application; all other embodiments obtained by those skilled in the art without inventive effort are intended to fall within the scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the present application.
Before describing embodiments of the present application, related terms referred to in the present application will be first described.
Automatic ship identification system (Automatic Identification System, abbreviated as AIS system): the system consists of shore-based (base station) facilities and shipborne equipment, and is a novel digital navigation aid system and equipment integrating network technology, modern communication technology, computer technology and electronic information display technology.
Radial basis function network (Radial basis function network, abbreviation: RBF network): is an artificial neural network that uses radial basis functions as activation functions. The output of the radial basis function network is a linear combination of the input radial basis function and the neuron parameters. Radial basis function networks have a variety of uses including function approximation, time series prediction, classification, and system control.
With the continued growth of global maritime traffic and the trend toward larger and faster ships, traffic density in navigation channels keeps increasing and marine casualties occur frequently, seriously affecting the loading and unloading efficiency of wharves and causing economic losses. During berthing, the operator's observation is obstructed by the tall hull, the ship's superstructure or side can easily collide with shore-based loading and unloading equipment, and the enormous inertia of an extremely heavy hull delays the ship's response to steering. In the related art, because the operator cannot accurately grasp the ship's state and the surrounding spatial orientation, collisions between the ship and shore equipment occur easily, and ship berthing collision accidents are frequent.
The embodiment of the invention provides a ship berthing collision early warning method, system and storage medium, which can accurately predict the ship's direction of motion, realize early warning and monitoring of ship berthing, and effectively reduce the occurrence of ship berthing collision accidents. Referring to fig. 1, the method of the embodiment of the present invention includes, but is not limited to, step S110, step S120, step S130, step S140, step S150, step S160, step S170, and step S180.
Specifically, the method application process of the embodiment of the invention includes, but is not limited to, the following steps:
s110: and acquiring the data of the automatic ship identification system.
S120: preprocessing the data of the automatic ship identification system to obtain ship preprocessing data.
S130: and predicting a navigational speed predicted value and a heading predicted value of the ship through a preset neural network according to the ship preprocessing data.
S140: and inputting the navigational speed predicted value and the heading predicted value into a ship track dynamics equation model to solve, so as to obtain ship track data.
S150: and inputting the ship track data into a preset iteration block for iteration to obtain a ship coordinate prediction sequence. And constructing a preset iteration block according to the ship track dynamics equation model and the double hidden layer radial basis function network model.
S160: and fusing the ship radar data and the video monitoring data to obtain video target image data.
S170: and fusing the ship coordinate prediction sequence and the video target image data to obtain a ship prediction track image.
S180: and carrying out ship collision early warning according to the ship predicted track image to obtain collision early warning information.
In the working process of this specific embodiment, the automatic ship identification system data are first obtained. Specifically, in this embodiment, ship targets are automatically identified by an automatic identification system (AIS), yielding ship AIS data, that is, automatic ship identification system data. The automatic ship identification system data in this embodiment comprise dynamic and static ship information; the dynamic information includes the ship's longitude, latitude, speed over ground and course over ground, while the static information includes the identification code, maritime mobile service identity, call sign, ship name, ship length, ship width, tonnage and ship type. At the same time, in this embodiment, trajectory prediction is performed on the raw AIS sentences received from the AIS receiver and forwarded by the serial port server. The automatic ship identification system data are preprocessed to obtain ship preprocessing data. For example, the acquired automatic ship identification system data may contain corrupted or erroneous records, and in this embodiment the reliability of the data is improved by preprocessing the ship AIS data. Then, according to the ship preprocessing data, the navigational speed predicted value and heading predicted value of the ship are predicted through a preset neural network: the ship preprocessing data are input into the preset neural network to predict the ship's speed and course at the next moment, thus obtaining the corresponding navigational speed predicted value and heading predicted value. Further, in this embodiment, the ship track data are obtained by inputting the navigational speed predicted value and the heading predicted value into the ship trajectory dynamics equation model and solving it. Specifically, the ship trajectory dynamics equation model in this embodiment is given by the following formulas (1) to (5):
(Formulas (1) to (3) and (5) are reproduced only as images in the original publication.)
β = 4.635 × 10⁻⁶  (4)
where l_on0 is the initial value of the longitude and l_at0 is the initial value of the latitude. If the speed function speed(t), the course function course(t), the initial longitude l_on0 and the initial latitude l_at0 are known, the position of the vessel at any moment can be calculated.
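Since formulas (1) to (3) and (5) are available only as images, the exact trajectory equations cannot be reproduced here. As a hedged illustration of the idea stated above, that the position at any moment can be computed from speed(t), course(t) and the initial longitude and latitude, the following Python sketch performs simple dead reckoning on a spherical earth; the function names, the time step and the integration scheme are assumptions for illustration, not the patent's formulas.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean earth radius in metres (assumed constant)

def propagate_position(lon0_deg, lat0_deg, speed_fn, course_fn, t_end_s, dt_s=1.0):
    """Integrate longitude/latitude from a speed function (m/s) and a course
    function (degrees clockwise from north). A generic dead-reckoning sketch,
    not the patent's formulas (1)-(5), which are published only as images."""
    lon, lat = math.radians(lon0_deg), math.radians(lat0_deg)
    t = 0.0
    while t < t_end_s:
        v = speed_fn(t)                      # speed over ground, m/s
        course = math.radians(course_fn(t))  # course over ground, rad
        d_north = v * math.cos(course) * dt_s
        d_east = v * math.sin(course) * dt_s
        lat += d_north / EARTH_RADIUS_M
        lon += d_east / (EARTH_RADIUS_M * math.cos(lat))
        t += dt_s
    return math.degrees(lon), math.degrees(lat)

# Example: constant 2 m/s on a 045 degree course for 10 minutes
print(propagate_position(113.5, 22.8, lambda t: 2.0, lambda t: 45.0, 600))
```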
Further, in the embodiment, the ship track data is input into a preset iteration block for iteration to obtain a ship coordinate prediction sequence. Specifically, referring to fig. 3, the present embodiment constructs a preset iteration block 310 according to a ship trajectory dynamics equation model 311 and a double hidden layer radial basis function network model 312. The formulas of the double hidden layer radial basis function network model 312 in this embodiment are shown in the following formulas (6) to (8):
h_1 = K(x; C, σ)  (6)
h_2 = sigmoid(W_2 h_1 + b_2)  (7)
y = W_3 h_2 + b_3  (8)
where x is a vector consisting of longitude and latitude, y is the navigational speed or the course, C is the centre of the basis function and σ is the width of the basis function; h_1 is the output value of the first hidden layer, h_2 is the output value of the second hidden layer, W_2 is the weight of the second hidden layer, b_2 is the bias value of the second hidden layer, W_3 is the weight of the output layer, b_3 is the bias value of the output layer, and K(·) is a Gaussian basis function, as shown in formula (9) (reproduced only as an image in the original publication).
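For concreteness, the forward pass of formulas (6) to (8) can be sketched in Python as below. The Gaussian basis of formula (9) is assumed to take the common form exp(-||x - c||² / (2σ²)), since the original formula is reproduced only as an image, and all shapes and parameter values are illustrative.

```python
import numpy as np

def gaussian_basis(x, centers, sigma):
    """First hidden layer, formula (6): radial (Gaussian) activations.
    Assumed standard form exp(-||x - c||^2 / (2 sigma^2))."""
    dists = np.linalg.norm(x[None, :] - centers, axis=1)
    return np.exp(-(dists ** 2) / (2.0 * sigma ** 2))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rbf_forward(x, centers, sigma, W2, b2, W3, b3):
    """Double hidden layer RBF forward pass, formulas (6)-(8).
    x: [longitude, latitude]; returns the predicted speed (or course)."""
    h1 = gaussian_basis(x, centers, sigma)   # (6)
    h2 = sigmoid(W2 @ h1 + b2)               # (7)
    return W3 @ h2 + b3                      # (8)

# Illustrative shapes: 8 RBF centres in (lon, lat) space, 6 second-layer units
rng = np.random.default_rng(0)
centers = rng.normal(size=(8, 2))
W2, b2 = rng.normal(size=(6, 8)), np.zeros(6)
W3, b3 = rng.normal(size=(1, 6)), np.zeros(1)
print(rbf_forward(np.array([113.5, 22.8]), centers, 1.0, W2, b2, W3, b3))
```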
further, in this embodiment, fusion is performed according to the ship radar data and the video monitoring data, so as to obtain video target image data. Specifically, in this embodiment, ship radar data is acquired through a radar subsystem, which includes millimeter wave radar and point cloud signal processing of the millimeter wave radar. Meanwhile, in the embodiment, video monitoring is performed on the wharf guard area through a 4K visible light and infrared double-light extremely-low-illumination camera, video monitoring data are obtained, and video pictures are displayed on a monitoring interface. Then, the embodiment fuses the ship radar data and the video monitoring data to obtain video target image data. Then, the embodiment fuses the ship coordinate prediction sequence and the video target image data to obtain a ship prediction track image. According to the method, the predicted ship longitude and latitude prediction sequence, namely the ship coordinate prediction sequence, is fused with the video target image data to obtain the ship predicted track image, and the ship track is combined with the ship space image, so that the ship movement direction can be accurately predicted. Further, in the embodiment, the ship collision early warning is performed according to the ship prediction track image, so as to obtain collision early warning information. According to the method, the device and the system, the ship collision early warning is carried out through the ship prediction track image fused with the multi-source information, the early warning and monitoring of the ship berthing are achieved, and the occurrence of ship berthing collision accidents is effectively reduced.
It should be noted that, in some embodiments of the present invention, the multi-source information fusion processing, collision early warning and directional broadcasting are performed through a ship display and alarm subsystem. The ship display and alarm subsystem also provides audible and visual alarms, an electronic chart display module, satellite picture display and AIS ship target display functions to support the daily work of the wharf.
In some embodiments of the present invention, the ship automatic identification system data is preprocessed, including, but not limited to:
and screening according to the data of the automatic ship identification system to obtain abnormal data. Wherein the abnormal data includes error data and missing data.
And classifying the abnormal data to obtain a data classification result.
And removing the error data according to the data classification result.
And repairing the missing data according to the data classification result.
In this embodiment, abnormal data are first obtained by screening the automatic ship identification system data. The abnormal data are then classified to obtain a data classification result, a rejection operation is performed on the error data according to the data classification result, and a repair operation is performed on the missing data according to the data classification result, thereby obtaining the ship preprocessing data. Specifically, the abnormal data in this embodiment include error data and missing data: the identified abnormal data are divided into an error state and a missing state, i.e. error data and missing data. The types of error data and missing data are then classified to obtain the corresponding data classification results. Further, the corresponding rejection or repair operation is performed according to the different types of abnormal data, and the ship preprocessing data are thus constructed.
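As a rough illustration of the screening, classification, rejection and repair flow described above, the following sketch rejects AIS records with out-of-range fields and repairs missing dynamic fields by interpolation; the column names and validity ranges are assumptions chosen for illustration and are not specified by the patent.

```python
import pandas as pd

def preprocess_ais(df: pd.DataFrame) -> pd.DataFrame:
    """Screen AIS records for anomalies, reject erroneous rows and repair
    missing values. Column names ('lon', 'lat', 'sog', 'cog', 'timestamp')
    and the validity ranges are illustrative assumptions."""
    df = df.sort_values("timestamp").reset_index(drop=True)

    # Screening: flag physically impossible (non-missing) values as error data
    def out_of_range(s, lo, hi):
        return s.notna() & ~s.between(lo, hi)

    errors = (
        out_of_range(df["lon"], -180, 180)
        | out_of_range(df["lat"], -90, 90)
        | out_of_range(df["sog"], 0, 60)     # knots, assumed upper bound
        | out_of_range(df["cog"], 0, 360)
    )
    df = df[~errors].copy()                  # rejection of error data

    # Repair: interpolate missing dynamic fields between neighbouring fixes
    for col in ["lon", "lat", "sog", "cog"]:
        df[col] = df[col].interpolate(method="linear", limit_direction="both")
    return df
```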
In some embodiments of the present invention, the pre-set neural network includes a convolutional layer, a pooling layer, and a fully-connected layer. Accordingly, in this embodiment, the predicted speed and heading of the ship are predicted by a preset neural network according to the ship preprocessing data, including but not limited to:
and inputting the ship pretreatment data into a convolution layer for first treatment to obtain local region characteristic data.
And inputting the local region characteristic data into a pooling layer for second processing to obtain characteristic mapping data.
And inputting the feature mapping data into the full connection layer for third processing to obtain a navigation speed predicted value and a heading predicted value. Wherein the third process includes concatenation, combination, and operation.
In this embodiment, the preset neural network is composed of a convolution layer, a pooling layer and a full connection layer. First, the ship preprocessing data are processed by the convolution layer (first processing) to obtain local area characteristic data, and the local area characteristic data are then input into the pooling layer for second processing to obtain feature mapping data. The feature mapping data are then processed by the full connection layer (third processing) to obtain the navigational speed predicted value and the heading predicted value; the third processing in this embodiment includes connection, combination and arithmetic processing. Specifically, the convolution layer in this embodiment comprises convolution kernels and an activation function, and performs convolution operations on the ship preprocessing data to extract the corresponding local area characteristic data. Each convolution kernel contains a plurality of neurons, and different convolution kernel sizes correspond to different feature extractors. The pooling layer is arranged after the convolution layer, with the output of the convolution layer connected to the input of the pooling layer; by pooling the local area characteristic data, e.g. selecting features and reducing the number of features and parameters, the feature mapping data are obtained. The second processing in this embodiment includes maximum pooling and average pooling. Further, the full connection layer is located at the top of the preset neural network, and its input is connected to the output of the pooling layer. The pooled feature mapping data are connected, combined and operated on through the full connection layer to output the final result, namely the navigational speed predicted value and the heading predicted value. It is easy to understand that, in this embodiment, the navigational speed predicted value and the heading predicted value can be fed back into the preset neural network and iterated several times, so that the predicted speed sequence and predicted heading sequence at future moments can be obtained, i.e. the navigational speed predicted values and heading predicted values for a preset number of steps.
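The patent does not state layer sizes or other hyper-parameters; the following PyTorch sketch is one plausible realisation of the convolution, pooling and full connection structure described above, with all dimensions chosen purely for illustration.

```python
import torch
import torch.nn as nn

class SpeedCoursePredictor(nn.Module):
    """Convolution -> pooling -> fully connected, as described above.
    Input: a window of AIS features (lon, lat, sog, cog) over `seq_len` steps.
    Output: predicted speed and course at the next moment.
    All layer sizes are illustrative assumptions."""
    def __init__(self, in_channels: int = 4, seq_len: int = 16):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=3, padding=1),  # local features
            nn.ReLU(),
        )
        self.pool = nn.MaxPool1d(kernel_size=2)                    # feature maps
        self.fc = nn.Sequential(                                   # connect/combine
            nn.Flatten(),
            nn.Linear(32 * (seq_len // 2), 64),
            nn.ReLU(),
            nn.Linear(64, 2),                                      # [speed, course]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(self.pool(self.conv(x)))

# Example: a batch of 8 windows, 4 features, 16 time steps
model = SpeedCoursePredictor()
print(model(torch.randn(8, 4, 16)).shape)   # torch.Size([8, 2])
```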
In some embodiments of the present invention, the video target image data is obtained by fusing ship radar data and video surveillance data, including but not limited to:
and calculating according to the ship radar data through the Euclidean distance algorithm and the clustering algorithm to obtain ship space data.
And identifying the video monitoring data through a preset target detection algorithm to obtain a ship target.
And obtaining ship position data according to the mapping relation of the ship target through the video image space and the geographic space.
And fusing the ship space data and the ship position data to obtain video target image data.
In this embodiment, ship space data are first obtained from the ship radar data by calculation with a Euclidean distance algorithm and a clustering algorithm, and the video monitoring data are then identified through a preset target detection algorithm to obtain the ship targets. Next, ship position data are obtained from the ship targets through the mapping relation between the video image space and the geographic space, so that the video target image data are obtained by fusing the ship space data and the ship position data. Specifically, the point cloud data acquired by the radar subsystem, that is, the ship radar data, are processed with the Euclidean distance algorithm and the clustering algorithm to obtain ship space data such as the volume, length, width and height of the ship. The video monitoring data are then identified through the preset target detection algorithm to obtain the ship targets in the video monitoring data. Illustratively, after the video monitoring data are recorded by the camera, the ship targets are identified from the video monitoring data through the yolov3 algorithm. The yolov3 algorithm is composed of different network layers; in this embodiment the network layers include an encoding layer and a decoding layer, where the encoding layer is composed of a plurality of convolution layers, activation functions and pooling layers and is used to extract high-order abstract features from the image. The decoding layer comprises convolution layers, pooling layers and a full connection layer, and outputs the prediction result, i.e. the identified ship target, by learning the features passed on by the encoding layer. Further, the specific position information of the ship, namely the ship position data, is obtained from the identified ship targets through the mapping relation between the video image space of the video monitoring data and the geographic space. The ship space data and the ship position data are then fused to obtain the video target image data. It is easy to understand that, because the video target image data combine the spatial data of the ship with its position information, the accuracy of ship berthing early warning can be effectively improved.
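The patent names the yolov3 detector but does not detail the mapping between video image space and geographic space. The sketch below assumes a pre-calibrated homography from pixel coordinates to a local geographic plane, which is one common way to realise such a mapping; the helper names and the use of the bottom-centre of each detection box as the waterline point are illustrative assumptions.

```python
import numpy as np

def pixel_to_geo(pixel_xy, homography):
    """Map a pixel coordinate to geographic coordinates via a pre-calibrated
    3x3 homography (an assumed realisation of the video-image-space to
    geographic-space mapping; the patent does not specify the method)."""
    u, v = pixel_xy
    p = homography @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]           # e.g. (lon, lat) or local east/north

def ship_positions_from_detections(detections, homography):
    """`detections` are (x_min, y_min, x_max, y_max) boxes from the detector
    (e.g. yolov3); the bottom-centre of each box is taken as the waterline
    point and projected into geographic space."""
    positions = []
    for x_min, y_min, x_max, y_max in detections:
        waterline_px = ((x_min + x_max) / 2.0, y_max)
        positions.append(pixel_to_geo(waterline_px, homography))
    return positions

# Example with an identity homography (a real calibration is assumed to exist)
H = np.eye(3)
print(ship_positions_from_detections([(100, 50, 220, 180)], H))
```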
In some embodiments of the present invention, the ship space data is calculated according to the ship radar data through the euclidean distance algorithm and the clustering algorithm, including but not limited to:
and constructing a point cloud cluster through a tree data structure according to the ship radar data.
Dividing the point cloud clusters according to the Euclidean distance algorithm to obtain subset cloud clusters;
and calculating the distance between each cloud point of the first cloud cluster in the subset cloud clusters and a preset cloud point to obtain point distance data. The preset cloud points comprise cloud points in a first cloud cluster and cloud points in a second cloud cluster adjacent to the first cloud cluster.
And carrying out cluster analysis on cloud points in the subset cloud clusters according to the point distance data to obtain a cluster result.
And estimating the geometric center of the clustering result to obtain geometric center data.
And calculating to obtain the ship space data of the clustering result through the geometric center data.
In this embodiment, a point cloud cluster is first constructed from the ship radar data through a tree data structure. Specifically, after the ship radar data are obtained by the millimetre wave radar, they are input into a KD-tree structure to establish the topological relation between the discrete point cloud data points, yielding a point cloud cluster M and enabling fast neighbourhood searches. A KD-tree is a multidimensional binary tree structure that can organize and store high-dimensional data; it partitions the data space into mutually disjoint subspaces, with each node of the binary tree splitting its space into two. If the data contained in a node fall below a pre-set lower limit, the corresponding subspace is not divided further. Since the point cloud signal is three-dimensional spatial data, three-dimensional information is processed. The point cloud cluster is then divided into subset cloud clusters according to the Euclidean distance algorithm. For example, the point cloud cluster M is divided, according to the spatial vector d = {d_1, d_2, …, d_n}, into different subset point cloud clusters M = {M_1, M_2, …, M_n}. Further, point distance data are obtained by calculating the distance between each cloud point of the first cloud cluster in the subset cloud clusters and the preset cloud points, where the preset cloud points include the other cloud points in the first cloud cluster and the cloud points in the second cloud cluster adjacent to the first cloud cluster. Illustratively, an empty cluster list R is first created to hold the index of the clustering results. Then, for each point cloud cluster M_i = {P_1, P_2, …, P_i}, the coordinate data (x_pi, y_pi) of each cloud point P_i in the first cloud cluster are used to solve the distance D between P_i and the cloud points in M_i and in the point cloud cluster adjacent to M_i, i.e. the second cloud cluster; the solving formula is formula (10), which is reproduced only as an image in the original publication.
Further, in this embodiment, cluster analysis is performed on the cloud points in the subset cloud clusters according to the point distance data to obtain a clustering result. Specifically, if the relation D ≤ d_i holds, the two points are judged to belong to the same feature cluster and are marked as feature cluster r_i. In addition, when the point set of each neighbourhood has been processed, the influence of error points should be removed. A minimum point number P_min is set, and r < P_min denotes an error point set; if the number of points P in r satisfies P ≥ P_min, then r_i is added to R. The above process is repeated until every M_i ∈ {P_1, P_2, …, P_i} has been traversed once, and the categories are merged according to the Euclidean distance of all clusters in R until the Euclidean distance between all clusters in the list R is greater than the distance threshold d_i, where d_i is calculated by formula (11) (reproduced only as an image in the original publication). In that formula, D_i ∈ {D_1, D_2, …, D_n} is the value of the distance-dividing region, ζ is a relaxation coefficient whose size is related to the down-sampling value used in data preprocessing, and the horizontal resolution angle of the lidar (its symbol appears only as an image in the original) is a constant value at a constant rotation frequency, determined by the physical structure of the lidar. The final clustering result list R is thus obtained.
Further, in this embodiment, the geometric centre of each clustering result is estimated to obtain geometric centre data, and the ship space data of the clustering results are obtained by calculation from the geometric centre data. Specifically, for each clustering result r_i ∈ R, the geometric centre is calculated to obtain the geometric centre data of that clustering result. Next, the dimensions of each clustering result are calculated by formula (12) (reproduced only as an image in the original publication), and the ship space data are obtained, where V_n, l_n, d_n and h_n denote the volume, length, width and height of the n-th clustering result respectively, x_max^n and x_min^n are the maximum and minimum x coordinates in the n-th clustering result, y_max^n and y_min^n are the maximum and minimum y coordinates, and z_max^n and z_min^n are the maximum and minimum z coordinates.
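Formula (12) is reproduced only as an image, but the surrounding text makes clear that each cluster's length, width, height and volume come from the coordinate extrema. The sketch below follows that reading; the axis-aligned bounding box and the centroid-based centre estimate are assumptions.

```python
import numpy as np

def cluster_dimensions(cluster_points: np.ndarray):
    """Geometric centre and axis-aligned dimensions of one clustering result.
    Follows the description around formula (12): l, d, h are the x, y, z
    extents and V their product; the centroid is used as the geometric centre,
    which is an assumed estimate."""
    centre = cluster_points.mean(axis=0)          # geometric centre estimate
    mins, maxs = cluster_points.min(axis=0), cluster_points.max(axis=0)
    length, width, height = maxs - mins           # x, y, z extents
    volume = length * width * height
    return centre, (volume, length, width, height)

pts = np.array([[0.0, 0.0, 0.0], [40.0, 8.0, 12.0], [20.0, 4.0, 6.0]])
print(cluster_dimensions(pts))
```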
In some embodiments of the present invention, the video target image data is obtained by fusing ship space data and ship position data, including but not limited to:
and performing time synchronization operation on the ship space data and the ship position data to obtain time synchronization data.
And carrying out coordinate system transformation operation according to the time synchronization data to obtain preset coordinate synchronization data. The preset coordinate synchronizing data comprise ship space coordinate synchronizing data and ship position coordinate synchronizing data.
And carrying out association matching according to the ship space coordinate synchronous data and the ship position coordinate synchronous data to obtain video target image data.
In this embodiment, the time synchronization operation is first performed on the ship space data and the ship position data to obtain time synchronization data. Specifically, in the time stamp alignment policy of this embodiment, the time courses of data processing by two different sensors are represented by time series 1 and time series 2, and t_1 and t_2 denote the average processing time of the sensor a algorithm and of the sensor b algorithm respectively. When the system starts working at T_0 and the time reaches T_1, the data processing of sensor a is complete, but according to the system design its result is not uploaded yet. When the time reaches T_2, the algorithmic processing of sensor b is complete, and the results of the two parts should be uploaded at the same time. Before the time reaches T_3, the two parts should each be suspended to wait for the start of the next period, and the system should give up the CPU during this interval to guarantee real-time performance. When the time reaches T_3, a new cycle starts and the above process is repeated, thereby ensuring time synchronization of the data. Further, in this embodiment, a coordinate system transformation operation is performed on the time synchronization data to obtain preset coordinate synchronization data, which include ship space coordinate synchronization data and ship position coordinate synchronization data. Illustratively, in the coordinate space transformation of this embodiment, a rigid transformation is first applied between the radar coordinate system and the hull coordinate system, and the intrinsic matrix of the image coordinate system is then unified; after the rigid transformation, the image and point cloud coordinate systems can be unified into the hull coordinate system and matched. In addition, for the image target, the transformation from geodetic coordinates to pixel coordinates can generally be realised based on the pinhole imaging principle. In order to obtain a larger viewing angle, this embodiment adopts a four-lens camera, and the images are cropped and stitched into one whole image. The fusion in this embodiment is based on a one-dimensional azimuth coordinate axis, and distortion correction of the image is neglected. Based on the pinhole imaging principle, one picture contains all image information within the viewing angle range, and different pixel positions are fixed relative to the camera within that viewing angle, so the azimuth coordinate of the target is given by formula (13), which is reproduced only as an image in the original publication.
Further, in this embodiment, association matching is performed between the ship space coordinate synchronization data and the ship position coordinate synchronization data to obtain the video target image data. Specifically, in the feature association matching of this embodiment, the detection frames in the image and the detection frames in the point cloud are extracted, the point cloud detection frames are projected through a front view, the relative distance between each target frame in the point cloud and each detection frame in the image is calculated, the minimum value is then retained from all the obtained distances, and if the minimum value is smaller than a set minimum error value, an association match is made, so that the video target image data are obtained. It is easy to understand that, by fusing the video tracking target information with the AIS target prediction data and associating the predicted continuous longitude and latitude sequence with the azimuth coordinates in the image, the predicted track image of the ship, i.e. the video target image data, can be displayed in the CCTV (closed-circuit television) video.
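A minimal sketch of the association step: with the point cloud targets already projected into pixel space, pairwise centre distances to the image detection boxes are computed, the per-target minimum is kept, and a match is accepted when it lies below an error bound. The threshold value and helper names are illustrative assumptions.

```python
import numpy as np

def associate_targets(radar_boxes_px, image_boxes_px, max_error_px=50.0):
    """Match radar (point cloud) targets, already projected to pixel space,
    with image detection boxes by nearest centre distance. Returns a list of
    (radar_idx, image_idx) pairs; `max_error_px` is an assumed error bound."""
    def centre(box):
        x_min, y_min, x_max, y_max = box
        return np.array([(x_min + x_max) / 2.0, (y_min + y_max) / 2.0])

    matches = []
    for i, rbox in enumerate(radar_boxes_px):
        dists = [np.linalg.norm(centre(rbox) - centre(ibox))
                 for ibox in image_boxes_px]
        if dists:
            j = int(np.argmin(dists))
            if dists[j] < max_error_px:    # minimum distance below error bound
                matches.append((i, j))
    return matches

print(associate_targets([(90, 40, 210, 170)], [(100, 50, 220, 180)]))
```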
In some embodiments of the present invention, a ship collision pre-warning is performed according to a ship predicted track image, so as to obtain collision pre-warning information, including but not limited to:
And constructing a ship collision early warning model according to the collision time.
And inputting the ship predicted track image into a ship collision early warning model for prediction to obtain collision early warning information.
In this embodiment, the ship collision early warning model is first constructed according to the collision time. Specifically, the ship collision early warning model is built with a time-to-collision (TTC) model, which performs early warning according to the relation between the ship-to-wharf distance and its rate of change; its expression is shown in formula (14):
(Formula (14) is reproduced only as an image in the original publication.)
where t_1 and t_2 denote two different moments, and d_1 and d_2 denote the relative distance between the ship and the wharf at t_1 and t_2 respectively.
Further, in this embodiment, the ship predicted track image is input into the ship collision early warning model for prediction, so as to obtain the collision early warning information. Specifically, in the collision early warning, different thresholds are set on the value of t_TTC and the warning is divided into three levels. When V_d > 0, the distance between the ship and the wharf is increasing, t_TTC > 0 always holds, and no warning is needed. When V_d ≤ 0, the warning levels are divided into three grades according to t_TTC and graded early warning is carried out. Illustratively, when t_TTC > 0 the early warning system outputs 0 and no warning is issued; when −120 < t_TTC ≤ 0, a first-level warning is output; when −300 < t_TTC ≤ −120, a second-level warning is output; and when t_TTC ≤ −300, a third-level warning is output. Correspondingly, the first-level warning in this embodiment includes broadcasting a voice call at normal speaking speed and sending first-level warning information by AIS short message, the broadcast being non-directional. The second-level warning includes broadcasting a faster voice call and sending second-level warning information by AIS short message, and, within a certain area, the broadcast is directed at and tracks the target ship. The third-level warning includes broadcasting the fastest voice call and sending third-level warning information by AIS short message and VHF voice, and, within a certain area, the broadcast is likewise directed at and tracks the target ship. In addition, it is easy to understand that at night this embodiment flashes the searchlight toward the water surface in the ship's direction of advance (without directly illuminating the ship), so as to mark the travel route on the water surface.
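Formula (14) is reproduced only as an image; the sketch below adopts one plausible reading in which V_d = (d_2 - d_1)/(t_2 - t_1) and t_TTC = d_2 / V_d, which is consistent with the negative thresholds quoted above while the ship closes on the wharf, and reproduces the three-level grading described in this embodiment.

```python
def ttc_warning_level(d1, d2, t1, t2):
    """Grade the berthing collision warning from two range measurements.
    One plausible reading of formula (14) (published only as an image):
    V_d = (d2 - d1) / (t2 - t1) and t_TTC = d2 / V_d, so t_TTC is negative
    while the ship closes on the wharf. Thresholds follow the text:
    0 = no warning, 1/2/3 = first/second/third-level warning."""
    v_d = (d2 - d1) / (t2 - t1)
    if v_d >= 0:               # spacing increasing or constant: no warning
        return 0
    t_ttc = d2 / v_d
    if t_ttc > 0:
        return 0
    if -120 < t_ttc <= 0:
        return 1
    if -300 < t_ttc <= -120:
        return 2
    return 3

# Closing from 150 m to 140 m over 10 s gives t_TTC = -140 s: second level
print(ttc_warning_level(150.0, 140.0, 0.0, 10.0))
```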
It should be noted that, in some embodiments of the present invention, when directional broadcasting is performed toward a ship at berthing risk, the microphone and broadcasting system support manual broadcasting of voice and other information from any client, and only one client can sound at the same time; a preemption mechanism and a maximum duration limit are supported, and when the limit is exceeded voice control is released automatically so that other clients have the opportunity to preempt it. In addition, the broadcast pan-tilt can be manually adjusted to point at a specific direction or target, and the broadcast can transmit sound at least 1000 metres in windless conditions. The audible and visual alarm searchlight can be turned on by remote control and can turn with the pan-tilt to point at a specific direction or target. Likewise, the alarm can be turned on by remote control to flash the alarm lamp and sound the siren. The VHF communication equipment supports monitoring channel 16 or other channels at any client, supports manually calling a ship on licensed channels from licensed clients to communicate or broadcast warning information, supports a maximum speaking duration limit with automatic release of control when the limit is exceeded so that other clients can preempt the channel, and supports automatically broadcasting warning information or other information through licensed VHF channels.
It should be noted that, in some embodiments of the present invention, an electronic chart display module is provided. The electronic chart display module supports loading and updating electronic charts in the IHO S-57 and IHO S-63 standard formats and electronic channel charts in the CJ-57 format, and supports displaying charts in vector mode in accordance with the IHO S-52 requirements. In addition, the electronic chart display module of this embodiment includes, but is not limited to, switching between day, dusk and night colour modes, switching between base, standard and full display groups as well as custom display groups, and supports basic operations such as zooming in, zooming out, panning and rotating charts. When the display scale changes, the chart with the appropriate compilation scale is automatically displayed, and the attribute information of any object on the chart can be queried. In addition, this embodiment can also plot symbols such as points, lines, areas, rectangles and ellipses on a chart, supports plotting chart-specific symbols, and can perform measurements on a chart, including distance, azimuth and area.
It should be noted that, in some embodiments of the present invention, the monitored water area is displayed in a satellite picture manner. The satellite picture and the electronic chart are registered, and the coordinates are consistent. Meanwhile, the embodiment can perform multi-stage enlargement and reduction, and support translation and rotation. The satellite picture and the electronic chart share one set of monitoring area setting, and can be switched between a chart display mode and a satellite picture display mode.
It should be noted that, in some embodiments of the present invention, an AIS ship target display module receives the AIS signals of ships within a radius of at least 2 km centred on the midpoint of the south edge of the wharf, and displays the AIS ship targets on the electronic chart display module or on the satellite picture. The AIS ship target display module also supports displaying AIS ship target information in a list, and the AIS static information and real-time dynamic information of a ship can be viewed in an information window by clicking on the AIS target. In addition, in this embodiment, AIS targets can be labelled on the chart by Chinese name, English name, call sign, MMSI number, etc., and whether to display the trail, speed vector line, heading and turning prompt of an AIS ship target can be selected.
It should be noted that, in some embodiments of the present invention, an information management subsystem is further provided for adding users, deleting users, modifying user rights, editing the information of ships in the ship library (including their Chinese names, etc.), and entering and editing a ship whitelist. Ships on the whitelist do not trigger an alarm when they enter the warning area.
It should be noted that, in some embodiments of the present invention, a record playback unit is provided for selecting a playback time range; the system can read the stored historical AIS and radar target information and the historical video data, and completely play back the situation at that time on an interface identical to the system's monitoring interface, including the navigation track of the ship, the video recorded by the camera and the alarm information generated. During playback the corresponding alarm information is not sent again through broadcasting, AIS or VHF, nor are the searchlight and the alarm triggered, but the playback device does play the alarm sounds, including the synthesized voice alarm.
It should be noted that, in some embodiments of the present invention, the AIS information service is provided to the clients through a central service unit. This embodiment also responds to warning zone setting requests and stores the warning zone settings so as to provide a warning zone query service for the clients. Correspondingly, it judges in real time whether radar targets and AIS targets enter the warning zone and provides a graded alarm information service to the clients. The central service unit can also respond to history playback requests, query the radar target and AIS target history records, broadcast the radar target and AIS target data to the clients at the requested playback speed, respond to requests such as rights management and information management, and save and back up the data.
One embodiment of the present invention further provides a ship berthing collision early warning system, including:
The acquisition module is used for acquiring ship automatic identification system data.
The preprocessing module is used for preprocessing the ship automatic identification system data to obtain ship preprocessing data.
The prediction module is used for predicting a predicted speed value and a predicted heading value of the ship through a preset neural network according to the ship preprocessing data.
The solving module is used for inputting the predicted speed value and the predicted heading value into the ship track dynamics equation model for solving, so as to obtain ship track data.
The iteration module is used for inputting the ship track data into a preset iteration block for iteration to obtain a ship coordinate prediction sequence. The preset iteration block is constructed according to the ship track dynamics equation model and the double hidden layer radial basis function network model. (A minimal dead-reckoning sketch of the solving and iteration steps is given after this list.)
The first fusion module is used for fusing the ship radar data and the video monitoring data to obtain video target image data.
The second fusion module is used for fusing the ship coordinate prediction sequence and the video target image data to obtain a ship predicted track image.
The early warning module is used for carrying out ship collision early warning according to the ship predicted track image to obtain collision early warning information.
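As flagged above, the following minimal sketch shows one way the solving and iteration steps could be realized as constant-rate dead reckoning driven by the predicted speed and heading. The time step, number of iterations and flat-earth approximation are assumptions of this sketch; it does not reproduce the patent's ship track dynamics equation model or its double hidden layer radial basis function iteration block.

```python
# Minimal dead-reckoning sketch; names and constants are illustrative only.
import math

def step_position(lat, lon, sog_knots, cog_deg, dt_s):
    """Advance a WGS-84 position by one time step using predicted SOG/COG."""
    v = sog_knots * 0.514444                       # knots -> m/s
    d_north = v * math.cos(math.radians(cog_deg)) * dt_s
    d_east = v * math.sin(math.radians(cog_deg)) * dt_s
    dlat = d_north / 111320.0                      # metres -> degrees latitude
    dlon = d_east / (111320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

def iterate_track(lat, lon, sog_pred, cog_pred, steps=30, dt_s=10.0):
    """Roll the dynamics step forward to build a predicted coordinate sequence."""
    seq = [(lat, lon)]
    for _ in range(steps):
        lat, lon = step_position(lat, lon, sog_pred, cog_pred, dt_s)
        seq.append((lat, lon))
    return seq

# Example: iterate_track(22.5, 113.4, sog_pred=6.0, cog_pred=45.0)[:3]
```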
Referring to fig. 2, an embodiment of the present invention further provides a ship berthing collision early warning system, including:
at least one processor 210; and
at least one memory 220 for storing at least one program,
wherein the at least one program, when executed by the at least one processor 210, causes the at least one processor 210 to implement the ship berthing collision early warning method described in the above embodiments.
An embodiment of the present invention also provides a computer-readable storage medium storing computer-executable instructions which, when executed by one or more control processors, perform the method steps described in the above embodiments.
Those of ordinary skill in the art will appreciate that all or some of the steps, systems, and methods disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as known to those skilled in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. Furthermore, as is well known to those of ordinary skill in the art, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
While the preferred embodiment of the present invention has been described in detail, the present invention is not limited to the above embodiment, and various equivalent modifications and substitutions can be made by those skilled in the art without departing from the spirit of the present invention, and these equivalent modifications and substitutions are intended to be included in the scope of the present invention as defined in the appended claims.

Claims (10)

1. A ship berthing collision early warning method, characterized by comprising the following steps:
acquiring data of an automatic ship identification system;
preprocessing the data of the automatic ship identification system to obtain ship preprocessing data;
predicting a predicted speed value and a predicted heading value of the ship through a preset neural network according to the ship preprocessing data;
inputting the predicted speed value and the predicted heading value into a ship track dynamics equation model for solving to obtain ship track data;
inputting the ship track data into a preset iteration block for iteration to obtain a ship coordinate prediction sequence; the preset iteration block is constructed according to the ship track dynamics equation model and the double hidden layer radial basis function network model;
fusing the ship radar data and the video monitoring data to obtain video target image data;
fusing the ship coordinate prediction sequence and the video target image data to obtain a ship predicted track image;
and carrying out ship collision early warning according to the ship predicted track image to obtain collision early warning information.
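A hedged illustration of the second fusion step of claim 1: assuming a 3×3 homography H mapping pixel coordinates to geographic coordinates is available (as in the calibration sketch given after claim 4 below), the predicted coordinate sequence can be projected back into the camera image and drawn over the video target image. The homography, color and frame layout are illustrative assumptions.

```python
# Hypothetical overlay of a predicted track onto a video frame.
import numpy as np
import cv2

def draw_predicted_track(frame: np.ndarray, coord_seq_geo: np.ndarray, H: np.ndarray):
    """Overlay a predicted track (projected geographic metres) onto a frame in place."""
    H_inv = np.linalg.inv(H)                                   # geo -> pixel mapping
    pts = coord_seq_geo.reshape(-1, 1, 2).astype(np.float32)
    pixel_pts = cv2.perspectiveTransform(pts, H_inv)           # (N, 1, 2) pixel coords
    cv2.polylines(frame, [pixel_pts.astype(np.int32)], isClosed=False,
                  color=(0, 0, 255), thickness=2)               # red predicted track
    return frame
```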
2. The ship berthing collision early warning method according to claim 1, wherein the preprocessing of the data of the automatic ship identification system comprises:
screening according to the data of the automatic ship identification system to obtain abnormal data; wherein the abnormal data includes error data and missing data;
classifying the abnormal data to obtain a data classification result;
performing a rejection operation on the error data according to the data classification result;
and repairing the missing data according to the data classification result.
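A hedged illustration of the preprocessing of claim 2: the sketch below screens out-of-range AIS records as error data and repairs missing speed and heading values by per-ship interpolation. The pandas column names, validity ranges and interpolation choice are assumptions of this sketch, not values specified in the claims.

```python
# Hypothetical AIS preprocessing sketch using pandas.
import pandas as pd

def preprocess_ais(df: pd.DataFrame) -> pd.DataFrame:
    """Expects columns: mmsi, timestamp, lat, lon, sog, cog."""
    df = df.sort_values(["mmsi", "timestamp"]).copy()
    # Classify and remove error records: positions or kinematics out of range.
    bad = (
        ~df["lat"].between(-90, 90)
        | ~df["lon"].between(-180, 180)
        | ~df["sog"].between(0, 50)       # knots
        | ~df["cog"].between(0, 360)      # degrees
    )
    df = df[~bad]
    # Repair missing kinematic fields per ship by time-ordered interpolation.
    df[["sog", "cog"]] = (
        df.groupby("mmsi")[["sog", "cog"]]
          .transform(lambda s: s.interpolate(limit_direction="both"))
    )
    return df
```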
3. The ship berthing collision early warning method according to claim 1, wherein the preset neural network comprises a convolution layer, a pooling layer and a full connection layer;
the predicting, according to the ship preprocessing data, the predicted speed value and the predicted heading value of the ship through a preset neural network includes:
inputting the ship preprocessing data into the convolution layer for first processing to obtain local area characteristic data;
inputting the local area characteristic data into the pooling layer for second processing to obtain feature mapping data;
inputting the feature mapping data into the full connection layer for third processing to obtain the predicted speed value and the predicted heading value; wherein the third processing includes concatenation, combination and computation operations.
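The convolution / pooling / fully connected structure of claim 3 can be illustrated with a small PyTorch model mapping a sliding window of preprocessed AIS features to a speed and heading prediction. The layer sizes, window length and feature count are assumptions of this sketch, not parameters from the patent.

```python
# Illustrative convolution-pooling-fully-connected network sketch.
import torch
import torch.nn as nn

class SpeedHeadingNet(nn.Module):
    def __init__(self, n_features: int = 4, window: int = 16):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=3, padding=1),  # local features
            nn.ReLU(),
            nn.MaxPool1d(2),                                      # feature mapping
        )
        self.fc = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (window // 2), 64),
            nn.ReLU(),
            nn.Linear(64, 2),   # outputs: predicted SOG and COG
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features, window) sliding window of AIS samples
        return self.fc(self.conv(x))

# model = SpeedHeadingNet(); y = model(torch.randn(8, 4, 16))  # -> shape (8, 2)
```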
4. The ship berthing collision early warning method according to claim 1, wherein the fusing of the ship radar data and the video monitoring data to obtain video target image data comprises:
calculating ship space data through a Euclidean distance algorithm and a clustering algorithm according to the ship radar data;
identifying the video monitoring data through a preset target detection algorithm to obtain a ship target;
obtaining ship position data according to the ship target through the mapping relation between the video image space and the geographic space;
and fusing the ship space data and the ship position data to obtain the video target image data.
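One way to realize the mapping from video image space to geographic space mentioned in claim 4 is a planar homography calibrated from control points whose pixel and geographic coordinates are both known. The sketch below uses OpenCV for illustration; the calibration points and the bottom-centre waterline heuristic are assumptions, not details from the patent.

```python
# Hypothetical image-space to geographic-space mapping via a homography.
import numpy as np
import cv2

# Calibration pairs (pixel -> projected geographic metres); values are placeholders.
pixel_pts = np.array([[100, 600], [1800, 620], [1700, 300], [200, 280]], dtype=np.float32)
geo_pts = np.array([[0, 0], [250, 0], [250, 120], [0, 120]], dtype=np.float32)
H, _ = cv2.findHomography(pixel_pts, geo_pts)

def bbox_to_geo(bbox_xyxy):
    """Map the bottom-centre of a detected ship bounding box to geographic coordinates."""
    x1, y1, x2, y2 = bbox_xyxy
    foot = np.array([[[(x1 + x2) / 2.0, y2]]], dtype=np.float32)  # waterline point
    geo = cv2.perspectiveTransform(foot, H)
    return geo[0, 0]   # (east_m, north_m) in the projected frame
```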
5. The ship berthing collision early warning method according to claim 4, wherein the calculating of ship space data through the Euclidean distance algorithm and the clustering algorithm according to the ship radar data comprises:
constructing point cloud clusters through a tree data structure according to the ship radar data;
dividing the point cloud clusters according to a Euclidean distance algorithm to obtain subset cloud clusters;
calculating the distance between each cloud point of the first cloud cluster in the subset cloud clusters and a preset cloud point to obtain point distance data; the preset cloud points comprise the cloud points in the first cloud cluster and the cloud points in a second cloud cluster adjacent to the first cloud cluster;
performing cluster analysis on the cloud points in the subset cloud clusters according to the point distance data to obtain a clustering result;
estimating the geometric center of the clustering result to obtain geometric center data;
and calculating the ship space data of the clustering result through the geometric center data.
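The Euclidean-distance clustering of claim 5 can be sketched with a KD-tree: points are region-grown into clusters whose chained point-to-point distances stay within a threshold, and each cluster's geometric center is then taken as a ship position. The 5 m threshold and 2-D point layout are assumptions of this sketch.

```python
# Hypothetical Euclidean clustering of radar point returns with a KD-tree.
import numpy as np
from scipy.spatial import cKDTree

def euclidean_cluster(points: np.ndarray, eps: float = 5.0):
    """Group points whose mutual distance chains stay within eps; returns labels."""
    tree = cKDTree(points)
    labels = np.full(len(points), -1, dtype=int)
    cluster_id = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        frontier = [i]
        labels[i] = cluster_id
        while frontier:                         # region-grow one cluster
            j = frontier.pop()
            for k in tree.query_ball_point(points[j], eps):
                if labels[k] == -1:
                    labels[k] = cluster_id
                    frontier.append(k)
        cluster_id += 1
    return labels

def geometric_centres(points: np.ndarray, labels: np.ndarray):
    """Estimate one geometric centre (ship position) per cluster."""
    return {c: points[labels == c].mean(axis=0) for c in np.unique(labels)}
```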
6. The ship berthing collision early warning method according to claim 4, wherein the fusing of the ship space data and the ship position data to obtain the video target image data comprises:
performing time synchronization operation on the ship space data and the ship position data to obtain time synchronization data;
performing coordinate system transformation operation according to the time synchronization data to obtain preset coordinate synchronization data; the preset coordinate synchronization data comprise ship space coordinate synchronization data and ship position coordinate synchronization data;
and performing association matching according to the ship space coordinate synchronization data and the ship position coordinate synchronization data to obtain the video target image data.
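The association matching of claim 6 can be illustrated, after time synchronization and coordinate transformation, as a gated assignment between radar-derived and video-derived positions. The use of the Hungarian algorithm and the 20 m gate below are assumptions of this sketch.

```python
# Hypothetical radar-video association sketch using the Hungarian algorithm.
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(radar_xy: np.ndarray, video_xy: np.ndarray, gate_m: float = 20.0):
    """Return index pairs (radar_i, video_j) whose matched distance is inside the gate."""
    if len(radar_xy) == 0 or len(video_xy) == 0:
        return []
    cost = np.linalg.norm(radar_xy[:, None, :] - video_xy[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)      # minimum-cost one-to-one matching
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= gate_m]

# Example: associate(np.array([[10.0, 5.0]]), np.array([[12.0, 4.0]]))  # -> [(0, 0)]
```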
7. The ship berthing collision early warning method according to claim 1, wherein the carrying out of the ship collision early warning according to the ship predicted track image to obtain collision early warning information comprises:
constructing a ship collision early warning model according to the collision time;
and inputting the ship predicted track image into the ship collision early warning model for prediction to obtain the collision early warning information.
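A minimal, hedged reading of the collision-time based early warning model of claim 7 is a graded rule on the estimated time to collision derived from the predicted track. The closing-distance and closing-speed inputs and the 60 s / 180 s thresholds below are illustrative assumptions, not values from the patent.

```python
# Hypothetical collision-time grading rule.
def collision_warning(distance_to_berth_m: float, closing_speed_mps: float):
    """Return (time_to_collision_s, level) with level 2 = alarm, 1 = warning, 0 = safe."""
    if closing_speed_mps <= 0:
        return float("inf"), 0          # opening or stationary: no collision expected
    ttc = distance_to_berth_m / closing_speed_mps
    if ttc < 60:
        return ttc, 2
    if ttc < 180:
        return ttc, 1
    return ttc, 0

# Example: collision_warning(90.0, 1.0)  # -> (90.0, 1)
```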
8. A ship berthing collision early warning system, characterized by comprising:
the acquisition module is used for acquiring the data of the automatic ship identification system;
the preprocessing module is used for preprocessing the ship automatic identification system data to obtain ship preprocessing data;
the prediction module is used for obtaining a predicted speed value and a predicted heading value of the ship through a preset neural network according to the ship preprocessing data;
the solving module is used for inputting the predicted speed value and the predicted heading value into a ship track dynamics equation model for solving to obtain ship track data;
the iteration module is used for inputting the ship track data into a preset iteration block for iteration to obtain a ship coordinate prediction sequence; the preset iteration block is constructed according to the ship track dynamics equation model and the double hidden layer radial basis function network model;
the first fusion module is used for fusing the ship radar data and the video monitoring data to obtain video target image data;
the second fusion module is used for fusing the ship coordinate prediction sequence and the video target image data to obtain a ship predicted track image;
and the early warning module is used for carrying out ship collision early warning according to the ship predicted track image to obtain collision early warning information.
9. A ship berthing collision early warning system, characterized by comprising:
at least one processor;
at least one memory for storing at least one program;
when the at least one program is executed by the at least one processor, the at least one processor is caused to implement the ship berthing collision early warning method as claimed in any one of claims 1 to 7.
10. A computer storage medium in which a processor-executable program is stored, characterized in that the processor-executable program, when executed by a processor, implements the ship berthing collision early warning method according to any one of claims 1 to 7.
CN202310093766.8A 2023-01-18 2023-01-18 Ship berthing collision early warning method, system and storage medium Pending CN116110255A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310093766.8A CN116110255A (en) 2023-01-18 2023-01-18 Ship berthing collision early warning method, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310093766.8A CN116110255A (en) 2023-01-18 2023-01-18 Ship berthing collision early warning method, system and storage medium

Publications (1)

Publication Number Publication Date
CN116110255A true CN116110255A (en) 2023-05-12

Family

ID=86267081

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310093766.8A Pending CN116110255A (en) 2023-01-18 2023-01-18 Ship berthing collision early warning method, system and storage medium

Country Status (1)

Country Link
CN (1) CN116110255A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116611743A (en) * 2023-07-17 2023-08-18 华航检测认证(青岛)有限公司 Building engineering construction quality evaluation method based on big data
CN116611743B (en) * 2023-07-17 2023-10-10 华航检测认证(青岛)有限公司 Building engineering construction quality evaluation method based on big data
CN117914953A (en) * 2024-03-20 2024-04-19 中国船级社 Ship data processing method, device and equipment
CN117914953B (en) * 2024-03-20 2024-06-07 中国船级社 Ship data processing method, device and equipment
CN118260659A (en) * 2024-05-31 2024-06-28 中国海洋大学 All-weather dock ship berthing detection system based on multi-sensor acquisition and analysis
CN118260659B (en) * 2024-05-31 2024-08-16 中国海洋大学 All-weather dock ship berthing detection system based on multi-sensor acquisition and analysis

Similar Documents

Publication Publication Date Title
CN116110255A (en) Ship berthing collision early warning method, system and storage medium
US20210224512A1 (en) Danet-based drone patrol and inspection system for coastline floating garbage
US10540554B2 (en) Real-time detection of traffic situation
CN113008263B (en) Data generation method and data generation device
JP7266668B2 (en) Video object fast detection method, apparatus, server and storage medium
CN111899450B (en) Method and system for monitoring ships entering and exiting port and finding dangerous ships
US7889232B2 (en) Method and system for surveillance of vessels
CN112257609B (en) Vehicle detection method and device based on self-adaptive key point heat map
CN104660993B (en) Maritime affairs intelligent control method and system based on AIS and CCTV
CN115004269B (en) Monitoring device, monitoring method, and program
CN111814753A (en) Target detection method and device under foggy weather condition
CN111126734A (en) Offshore wind farm dispatching management system
CN115908442A (en) Image panorama segmentation method for unmanned aerial vehicle ocean monitoring and model building method
CN112562417A (en) Ship emergency command management system and method
CN112462774A (en) Urban road supervision method and system based on unmanned aerial vehicle navigation following and readable storage medium
CN110555378B (en) Live video-based weather prediction method and system and weather prediction device
CN112506219A (en) Intelligent traffic supervision unmanned aerial vehicle track planning method and system and readable storage medium
CN114067142A (en) Method for realizing scene structure prediction, target detection and lane level positioning
CN113450459B (en) Method and device for constructing three-dimensional model of target object
Zhang et al. A warning framework for avoiding vessel‐bridge and vessel‐vessel collisions based on generative adversarial and dual‐task networks
US20240257526A1 (en) Monitoring method and apparatus, and unmanned vehicle and monitoring device
CN113822217A (en) Ship tail gas monitoring method based on AIS and video image analysis
CN115311900A (en) Inland waterway ship auxiliary target identification system and method based on visual enhancement
CN114419444A (en) Lightweight high-resolution bird group identification method based on deep learning network
Cafaro et al. Towards Enhanced Support for Ship Sailing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination