CN116233782A - Multi-sensor-based unmanned ship sea surface data acquisition time synchronization method - Google Patents


Info

Publication number
CN116233782A
CN116233782A
Authority
CN
China
Prior art keywords
sensor
point cloud
frame
sampling
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310059293.XA
Other languages
Chinese (zh)
Inventor
周洋
邵世林
李政霖
彭艳
谢少荣
罗均
Current Assignee
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology
Priority to CN202310059293.XA
Publication of CN116233782A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30: Services specially adapted for particular environments, situations or purposes
    • H04W 4/38: Services specially adapted for particular environments, situations or purposes, for collecting sensor information
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 56/00: Synchronisation arrangements
    • H04W 56/001: Synchronization between nodes
    • H04W 56/0015: Synchronization between nodes, one node acting as a reference for the others
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 84/00: Network topologies
    • H04W 84/18: Self-organising networks, e.g. ad-hoc networks or sensor networks
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/30: Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The invention relates to the technical field of sensor data acquisition and discloses a multi-sensor-based time synchronization method for unmanned ship sea surface data acquisition, comprising the following steps. S1: set a fixed sampling frequency based on the data sampling frequency range of each sensor, adjust each sensor's sampling frequency to match the fixed frequency, and sample with the adjusted sensors; for any sensor whose sampling frequency range lies below the fixed frequency, sample at the sensor's original frequency and then apply frame interpolation to the acquired data, so that the number of frames after interpolation equals that of the sensors sampling at the fixed frequency. S2: acquire the UTC time from the unmanned ship's GPS sensor and attach the GPS UTC timestamp to every frame of data acquired by each sensor. The method ensures time synchronization of the data acquired by the different sensors of the unmanned ship and enables more effective and accurate positioning and tracking of the unmanned ship and its targets.

Description

Multi-sensor-based unmanned ship sea surface data acquisition time synchronization method
Technical Field
The invention relates to the technical field of sensor data acquisition, in particular to a multi-sensor-based unmanned ship sea surface data acquisition time synchronization method.
Background
The unmanned ship is a surface platform with a degree of autonomy: it can navigate independently and carry out certain combat and operational tasks without an onboard crew. Its low cost, covert operation, and flexibility give it great application value in the military field, where it can perform combat tasks such as environmental reconnaissance, anti-submarine detection and tracking, mine-warfare surveillance, search and rescue, and materiel transport.
The unmanned ship completes its tasks through sensing and decision making using the sensors it carries, including the optoelectronic pod, lidar, marine radar, and the Global Positioning System (GPS). Of these tasks, sensing is the most important: only with reliable perception can data support be provided for decision making and, in turn, for behavior control of the unmanned ship. Single-sensor perception schemes are mature: a vision sensor completes target detection, a lidar completes target recognition and tracking, and a marine radar completes target positioning and tracking; such schemes have been applied very successfully on unmanned ships. Conditions at sea, however, are complex and the weather changeable, with scenes such as rain, fog, strong sunlight, and reflections off the water surface. In these complex sea-surface scenes, each sensor has its own strengths and weaknesses. The optoelectronic pod, for example, may recognize targets inaccurately or not at all in fog or rain, while the marine radar and lidar still complete detection and tracking well in that weather; conversely, the marine radar and lidar struggle to distinguish different targets in complex scenes, where the optoelectronic pod distinguishes them easily. To bring the advantages of the different sensors into full play, a multi-sensor fusion scheme is needed that exploits the strengths of each sensor across the various complex and difficult scenes.
Multi-sensor fusion combines and optimizes complementary information from multiple sensors at multiple levels and in multiple spaces, finally producing a consistent interpretation of the observed environment. In this process multi-source data is fully and reasonably exploited: the goal of information fusion is to take the separate observations from the sensors and, by combining them at multiple levels and from multiple aspects, obtain more useful information. Effective fusion of multi-source data first requires tight integration of the sensors, unifying the data of all sensors to the same temporal and spatial reference so that their data is synchronized; only then can the unmanned ship and its targets be positioned and tracked effectively and accurately. The spatial reference is unified by obtaining the relative poses of the sensors through calibration; the temporal reference is unified by keeping the absolute time accuracy of the acquisition system within a bounded error and acquiring sensor data synchronously with very low latency. Simply using system time cannot properly align the multi-source data of the sensors; moreover, after a program has run for some time on system clocks, the data of the multiple boats' multiple sensors drifts apart, which degrades the subsequent fusion of the multi-source data.
Disclosure of Invention
Aiming at the problems and defects of the prior art, the invention provides a multi-sensor-based time synchronization method for unmanned ship sea surface data acquisition.
In order to achieve the aim of the invention, the technical scheme adopted by the invention is as follows:
a sea surface data acquisition time synchronization method of an unmanned ship based on multiple sensors comprises the following steps:
S1: Set a fixed sampling frequency based on the data sampling frequency range of each sensor used by the unmanned ship to acquire sea surface data, adjust each sensor's sampling frequency to match the fixed frequency, and sample with the adjusted sensors. For any sensor whose sampling frequency range lies below the fixed frequency, sample at the sensor's original frequency and then apply frame interpolation to the acquired data, so that the number of frames after interpolation equals that of the sensors sampling at the fixed frequency;
S2: Acquire the UTC time from the unmanned ship's GPS sensor and attach the GPS UTC timestamp to every frame of data acquired by each sensor.
According to the above multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, preferably, in step S1 a PointINet (Point Cloud Frame Interpolation Network) is used to perform frame interpolation on the acquired sampled data. The PointINet network comprises a point cloud warping module and a points fusion module: the warping module predicts a bidirectional 3D scene flow between two consecutive frames of point cloud data and warps each point cloud to a given time step according to the predicted flow; the fusion module comprises an adaptive sampling submodule, an adaptive KNN clustering submodule, and an attentive points fusion submodule.
According to the above multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, preferably, the specific operation of performing frame interpolation on the acquired sampled data with the PointINet network is as follows:
S11: Let P_0 and P_1 be any two consecutive frames of sampled data acquired by the sensor, P_{0→t} the point cloud obtained by warping the P_0 frame point cloud by time step t, and P_{1→t} the point cloud obtained by warping the P_1 frame point cloud by time step t, with 0 < t < 1; P_t is the intermediate point cloud obtained by interpolating between the P_0 and P_1 frame point clouds at time step t;
S12: Set the time step of the point cloud warping module of the PointINet network to t; input the P_0 frame point cloud into the warping module to obtain the P_{0→t} frame point cloud, and input the P_1 frame point cloud into the warping module to obtain the P_{1→t} frame point cloud;
S13: Input the P_{0→t} and P_{1→t} frame point clouds into the points fusion module of the PointINet network for fusion, generating the intermediate point cloud P_t between the P_0 and P_1 frames.
According to the above multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, preferably, in step S12 the positions of the P_{0→t} and P_{1→t} frame point clouds are:
P_{0→t} = P_0 + F_{0→t}
P_{1→t} = P_1 + F_{1→t}
F_{0→t} = t × F_{0→1}
F_{1→t} = (1-t) × F_{1→0}
where F_{0→1} denotes the point-motion scene flow of the point cloud moving from the P_0 frame to the P_1 frame, and F_{1→0} denotes the point-motion scene flow of the point cloud moving from the P_1 frame to the P_0 frame.
According to the above multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, preferably, the sensors comprise an optoelectronic pod, a lidar, a marine radar, a millimeter-wave radar, and a GPS sensor.
According to the above multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, preferably, in step S1 the sampling frequencies of the optoelectronic pod, lidar, millimeter-wave radar, and GPS sensor are adjusted to the fixed sampling frequency; the sampling frequency of the marine radar is lower than the fixed frequency, so frame interpolation is applied to the marine radar's sampled data.
According to the above multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, preferably, the fixed sampling frequency is 10 Hz.
According to the above multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, preferably, the time step t is set to 0.1 when performing frame interpolation on the marine radar's sampled data.
According to the above multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, preferably, in step S2, after the GPS UTC timestamp is added to each frame of data acquired by each sensor, the sampled data of the sensors is analyzed visually to verify whether the timestamps of the data are consistent.
According to the above multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, preferably, the visual analysis of the sampled data is performed as follows: project the sampled data acquired by the lidar, millimeter-wave radar, marine radar, and GPS sensor onto the image acquired by the optoelectronic pod, and verify whether the projected points of the sensor data on the image are consistent with the image content; if they are consistent, the data acquisition times of the sensors are synchronized.
According to the above multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, preferably, when the sampled data acquired by the lidar, millimeter-wave radar, marine radar, and GPS sensor is projected onto the image acquired by the optoelectronic pod, the following projection formula is adopted:
P′ = K [I | 0] [R t; 0 1] P,  K = [f_x s x_0; 0 f_y y_0; 0 0 1]
where P is the point cloud coordinate before projection and P′ is the coordinate in the image coordinate system after projection; x_0, y_0, f_x, f_y, and s/f_x are camera intrinsics computed with a calibration tool; I, t, and R are the projection extrinsics, obtained with a measuring tool.
Compared with the prior art, the invention has the following positive and beneficial effects:
the method can ensure the time synchronization of the sampling data acquired by different sensors of the unmanned ship, is favorable for obtaining a high-quality multi-azimuth multi-sensor time synchronization data set, and can more effectively and accurately realize the height positioning and tracking of the unmanned ship and the target.
Drawings
FIG. 1 is a diagram illustrating a network architecture of a PointINet network according to the present invention;
FIG. 2 is a graph of the verification result of the laser radar sampling data of the present invention projected onto an image acquired by the optoelectronic pod;
FIG. 3 is a diagram of the verification result of the data collected by the marine radar and the GPS sensor projected onto the image collected by the optoelectronic pod;
fig. 4 is a diagram of a verification result of the millimeter wave radar and the GPS sensor sampling data projected onto an image acquired by the optoelectronic pod.
Detailed Description
The present invention will be described in further detail by way of the following specific examples, which are not intended to limit the scope of the present invention.
Example 1:
A multi-sensor-based sea surface data acquisition time synchronization method for an unmanned ship comprises the following steps:
S1: Set a fixed sampling frequency based on the data sampling frequency ranges of the sensors the unmanned ship uses to acquire sea surface data: the optoelectronic pod, lidar, marine radar, millimeter-wave radar, and GPS sensor. The optoelectronic pod samples at 10 Hz, the lidar at 10 Hz to 25 Hz, the millimeter-wave radar at 1 Hz to 80 Hz, and the GPS sensor at 1 Hz to 10 Hz; because of its very long operating range, the marine radar samples at no more than 1 Hz. The fixed sampling frequency is therefore set to 10 Hz. The sampling frequencies of the optoelectronic pod, lidar, millimeter-wave radar, and GPS sensor are all adjusted to 10 Hz for sampling. Because the marine radar's sampling frequency is below the fixed frequency, the marine radar samples at 1 Hz, and the PointINet network is then used to interpolate frames in its sampled data, so that the number of frames after interpolation equals that of the sensors sampling at the fixed frequency.
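As an illustrative sketch (the function name is hypothetical, not from the patent), the interpolation schedule implied by these rates, i.e. nine intermediate clouds per pair of 1 Hz marine-radar frames at steps of t = 0.1, can be computed as:

```python
def interpolation_steps(src_hz: float, target_hz: float) -> list[float]:
    """Time steps t in (0, 1) at which intermediate frames must be
    synthesized between two consecutive source frames so that the
    interpolated stream matches the target rate."""
    ratio = int(round(target_hz / src_hz))  # output frames per source frame
    return [round(k / ratio, 6) for k in range(1, ratio)]

# Marine radar at 1 Hz raised to the 10 Hz fixed sampling frequency:
# nine intermediate clouds per frame pair, stepped by t = 0.1.
steps = interpolation_steps(1.0, 10.0)
```

Each value of `steps` would be fed to the PointINet warping module as the time step t described below.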
The PointINet network comprises a point cloud warping module (Point cloud warping) and a points fusion module (Points fusion); its architecture is shown in FIG. 1. The warping module has two main functions: first, it predicts a bidirectional 3D scene flow between two consecutive frames of point cloud data for motion estimation (the 3D scene flow represents the motion field from one point cloud to the other and is predicted with a FlowNet3D-based scene flow estimation network); second, it warps each point cloud to a given time step according to the linearly interpolated predicted 3D scene flow. For every two consecutive frames of point cloud data, the points fusion module fuses the two warped clouds into an intermediate point cloud, completing the frame interpolation. The fusion module comprises an adaptive sampling submodule, an adaptive KNN clustering submodule, and an attentive points fusion submodule: it adaptively samples points from the two warped point clouds, builds K nearest-neighbor clusters for each sampled point according to the time step so as to weight the contributions of the two clouds, and then aggregates the points in each cluster with the attentive points fusion submodule to generate the intermediate point cloud.
For the point cloud data acquired by the marine radar, the specific operation of frame interpolation with the PointINet network is as follows:
S11: Let P_0 and P_1 be any two consecutive frames of sampled data acquired by the radar, P_{0→t} the point cloud obtained by warping the P_0 frame point cloud by time step t, and P_{1→t} the point cloud obtained by warping the P_1 frame point cloud by time step t, with 0 < t < 1; P_t is the intermediate point cloud obtained by interpolating between the P_0 and P_1 frame point clouds at time step t. To ensure that the marine radar's point cloud data reaches 10 Hz after frame interpolation, the time step is set to t = 0.1, and the intermediate point clouds P_t are generated accordingly.
S12: setting the time step of a point cloud conversion module of the PointINet network as t, and inputting the point cloud data of the P0 frame into the point cloud conversion module of the PointINet network to perform point cloud conversion to obtain
Figure BDA0004060982600000065
A frame point cloud; inputting point cloud data of the P1 frame into a point cloud conversion module of the PointINet network to perform point cloud to obtain +.>
Figure BDA0004060982600000066
And (5) a frame point cloud.
The positions of the P_{0→t} and P_{1→t} frame point clouds are:
P_{0→t} = P_0 + F_{0→t}
P_{1→t} = P_1 + F_{1→t}
F_{0→t} = t × F_{0→1}
F_{1→t} = (1-t) × F_{1→0}
where F_{0→1} denotes the point-motion scene flow of the point cloud moving from the P_0 frame to the P_1 frame, and F_{1→0} denotes the point-motion scene flow of the point cloud moving from the P_1 frame to the P_0 frame.
The key to this step is to estimate, for every point in the P_0 frame, its motion from P_0 to P_1, and for every point in the P_1 frame, its motion from P_1 to P_0. The invention assumes that point motion between two consecutive point cloud frames is linear; the point-motion scene flow of the point cloud moving from the P_0 frame to the P_1 frame is denoted F_{0→1}, and that of the point cloud moving from the P_1 frame to the P_0 frame is denoted F_{1→0}.
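Under this linear-motion assumption, warping both clouds to time step t reduces to two scaled vector additions. The following NumPy sketch (function and variable names are hypothetical) illustrates the warping formulas above:

```python
import numpy as np

def warp_point_clouds(P0, P1, F01, F10, t):
    """Warp two consecutive point clouds (Nx3 arrays) toward time t,
    assuming each point moves linearly along its estimated 3D scene flow."""
    P0_t = P0 + t * F01          # F_{0->t} = t * F_{0->1}
    P1_t = P1 + (1.0 - t) * F10  # F_{1->t} = (1-t) * F_{1->0}
    return P0_t, P1_t
```

In the full method the flows `F01` and `F10` would come from the FlowNet3D-based scene flow estimator, not be given directly.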
S13: will be
Figure BDA00040609826000000617
Frame point cloud and->
Figure BDA00040609826000000618
And the frame point cloud is input into a point fusion module of the PointINet network to be fused, so that an intermediate point cloud Pt between the P0 frame and the P1 frame is generated.
The specific operation is as follows. The adaptive sampling submodule randomly draws N_0 and N_1 points from P_{0→t} and P_{1→t} respectively, generating two sampled point clouds P′_{0→t} and P′_{1→t}:
N_0 = (1-t) × N
N_1 = t × N
where N denotes the number of points in the intermediate point cloud P_t. The sampled clouds P′_{0→t} and P′_{1→t} are then combined into a new point cloud, the adaptive KNN clustering submodule generates K nearest neighbors for each sampled point, and the result is input into the attentive points fusion submodule, which finally generates the intermediate point cloud P_t.
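A minimal sketch of the sampling-and-merge step, using random sampling and concatenation as a simple stand-in for the adaptive KNN clustering and attentive fusion submodules (function and variable names are hypothetical):

```python
import numpy as np

def fuse_by_sampling(P0_t, P1_t, t, N, rng=None):
    """Draw N0 = (1-t)*N points from the cloud warped from P0 and
    N1 = t*N points from the cloud warped from P1, then concatenate
    them into one cloud of N points."""
    rng = rng or np.random.default_rng(0)
    N0 = int(round((1.0 - t) * N))
    N1 = N - N0  # equals round(t * N) up to rounding
    i0 = rng.choice(len(P0_t), size=N0, replace=False)
    i1 = rng.choice(len(P1_t), size=N1, replace=False)
    return np.concatenate([P0_t[i0], P1_t[i1]], axis=0)

# With t = 0.1 the intermediate cloud lies close to P0, so most of its
# points (45 of 50 here) are drawn from the cloud warped from P0.
P0_t = np.zeros((100, 3))
P1_t = np.ones((100, 3))
Pt = fuse_by_sampling(P0_t, P1_t, t=0.1, N=50)
```

The real PointINet fusion aggregates each point's K nearest neighbors with learned attention weights rather than simply concatenating samples.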
S2: and acquiring UTC time of the unmanned ship GPS sensor, and adding a UTC time stamp of the GPS to each frame of sampling data acquired by each sensor.
Each sensor has its own internal clock, but because each clock drifts differently, time differences arise between the clocks, so the clock sources of the sensors must be unified. Here the UTC time of the GPS is used as the unified time source: the GPS provides the reference time for each sensor, and each sensor calibrates its own clock against that reference. The specific operation is as follows: parse the raw GPRMC (recommended minimum positioning information) data acquired by the GPS to obtain the GPS position and the UTC time at the current moment, then publish the obtained UTC time to each sensor (optoelectronic pod, lidar, marine radar, and millimeter-wave radar) through the ROS system as the unified time source, completing the data acquisition time synchronization of the sensors.
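A hedged sketch of the GPRMC parsing step, extracting the UTC timestamp from a raw NMEA sentence (the sentence shown is a standard textbook example, not data from the patent; the function name is hypothetical):

```python
from datetime import datetime, timezone

def gprmc_utc(sentence: str) -> datetime:
    """Extract the UTC timestamp from a raw $GPRMC NMEA sentence:
    comma-separated field 1 is hhmmss.sss, field 9 is ddmmyy."""
    fields = sentence.split("*")[0].split(",")  # drop the *checksum suffix
    hhmmss, ddmmyy = fields[1], fields[9]
    return datetime.strptime(
        ddmmyy + hhmmss.split(".")[0], "%d%m%y%H%M%S"
    ).replace(tzinfo=timezone.utc)

# Illustrative sentence (a common NMEA documentation example):
ts = gprmc_utc("$GPRMC,123519.00,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A")
```

In the described system this timestamp, rather than any sensor's system clock, would be attached to each frame of sampled data.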
Further, in step S2, after the GPS UTC timestamp is added to each frame of data acquired by each sensor, the sampled data of the sensors is analyzed visually to verify whether the timestamps of the data are consistent. The visual analysis is performed as follows: project the sampled data acquired by the lidar, millimeter-wave radar, marine radar, and GPS sensor onto the image acquired by the optoelectronic pod, and verify whether the projected points of the sensor data on the image are consistent with the image content; if they are consistent, the data acquisition times of the sensors are synchronized.
When the sampled data acquired by the lidar, millimeter-wave radar, marine radar, and GPS sensor is projected onto the image acquired by the optoelectronic pod, the projection formula is:
P′ = K [I | 0] [R t; 0 1] P,  K = [f_x s x_0; 0 f_y y_0; 0 0 1]
where P is the point cloud coordinate before projection and P′ is the coordinate in the image coordinate system after projection; x_0, y_0, f_x, f_y, and s/f_x are camera intrinsics computed with a calibration tool; I, t, and R are the projection extrinsics, obtained with a measuring tool.
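A small NumPy sketch of this projection, using the equivalent pinhole form P′ ~ K(RP + t); the function name and the calibration values in the comments are hypothetical, since the actual intrinsics and extrinsics would come from the calibration and measuring tools:

```python
import numpy as np

def project_points(P_world, K, R, t):
    """Project Nx3 world-frame points into pixel coordinates via the
    pinhole model: transform into the camera frame, apply the intrinsic
    matrix, then perform the perspective divide."""
    P_cam = P_world @ R.T + t        # world -> camera frame (extrinsics R, t)
    uvw = P_cam @ K.T                # camera frame -> homogeneous pixels
    return uvw[:, :2] / uvw[:, 2:3]  # perspective divide -> (u, v)

# Illustrative intrinsics: f_x = f_y = 100, principal point (50, 50), no skew.
K = np.array([[100.0, 0.0, 50.0],
              [0.0, 100.0, 50.0],
              [0.0, 0.0, 1.0]])
uv = project_points(np.array([[1.0, 1.0, 10.0]]), K, np.eye(3), np.zeros(3))
```

Points whose projections land on the corresponding targets in the image, as in FIGS. 2 to 4, indicate that the sensor data is time-aligned.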
The verification result of projecting the point cloud data acquired by the lidar onto the image acquired by the optoelectronic pod is shown in FIG. 2; the blue points in FIG. 2 are the projections of the lidar point cloud on the optoelectronic pod image. As FIG. 2 shows, the lidar point cloud projects correctly onto the target in the image. The verification result of projecting the marine radar point cloud and the GPS position information onto the image is shown in FIG. 3: the green points are the projections of the marine radar point cloud, and the red points the projections of the GPS position information. As FIG. 3 shows, the marine radar point cloud projects correctly onto the target in the image. The verification result of projecting the millimeter-wave radar point cloud and the GPS position information onto the image is shown in FIG. 4: the green points are the projections of the millimeter-wave radar point cloud, and the red points the projections of the GPS position information. As FIG. 4 shows, the millimeter-wave radar point cloud projects correctly onto the target in the image.
These correct projections show that, after processing with the multi-sensor-based unmanned ship sea surface data acquisition time synchronization method, the acquisition times of the sensors' data are synchronized.
The above description is merely illustrative of the preferred embodiments of the present invention and is not intended to limit the invention to the particular embodiments disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Claims (10)

1. A multi-sensor-based sea surface data acquisition time synchronization method for an unmanned ship, characterized by comprising the following steps:
S1: Set a fixed sampling frequency based on the data sampling frequency range of each sensor used by the unmanned ship to acquire sea surface data, adjust each sensor's sampling frequency to match the fixed frequency, and sample with the adjusted sensors. For any sensor whose sampling frequency range lies below the fixed frequency, sample at the sensor's original frequency and then apply frame interpolation to the acquired data, so that the number of frames after interpolation equals that of the sensors sampling at the fixed frequency;
S2: Acquire the UTC time from the unmanned ship's GPS sensor and attach the GPS UTC timestamp to every frame of data acquired by each sensor.
2. The multi-sensor-based unmanned ship sea surface data acquisition time synchronization method according to claim 1, wherein in step S1 a PointINet is used to perform frame interpolation on the acquired sampled data; the PointINet network comprises a point cloud warping module and a points fusion module, the warping module predicting a bidirectional 3D scene flow between two consecutive frames of point cloud data and warping each point cloud to a given time step according to the predicted flow, and the fusion module comprising an adaptive sampling submodule, an adaptive KNN clustering submodule, and an attentive points fusion submodule.
3. The multi-sensor-based unmanned ship sea surface data acquisition time synchronization method according to claim 2, wherein the specific operation of performing frame interpolation on the acquired sampled data with the PointINet network is as follows:
s11: p0 and P1 are set as any two continuous frames of sampling data acquired by the sensor,
Figure FDA0004060982590000011
for the point cloud obtained by carrying out the point cloud transformation on the P0 frame point cloud according to the time step length t, the method comprises the steps of +.>
Figure FDA0004060982590000012
In order to transform the P1 frame point cloud according to the time step length t, the point cloud is more than 0 and less than 1,
Figure FDA0004060982590000013
the method comprises the steps that intermediate point cloud data are obtained after the point cloud data of P0 frames and P1 frames are subjected to frame interpolation processing according to a time step t;
s12: setting the time step of a point cloud conversion module of the PointINet network as t, and inputting the point cloud data of the P0 frame into the point cloud conversion module of the PointINet network to perform point cloud conversion to obtain
Figure FDA0004060982590000014
A frame point cloud; inputting point cloud data of the P1 frame into a point cloud conversion module of the PointINet network to perform point cloud to obtain +.>
Figure FDA0004060982590000015
A frame point cloud;
s13: will be
Figure FDA0004060982590000016
Frame point cloud and->
Figure FDA0004060982590000017
And the frame point cloud is input into a point fusion module of the PointINet network to be fused, so that an intermediate point cloud Pt between the P0 frame and the P1 frame is generated.
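The fusion in step S13 can be illustrated with a deliberately simplified stand-in. The sketch below merges the two warped clouds and samples the output with time-dependent weights; the patent's point fusion module instead uses adaptive sampling, adaptive k-NN clustering and attention, so this is only a naive illustration of the input/output shape of that step, with all names hypothetical:

```python
import numpy as np

def fuse_warped_clouds(p0_warped, p1_warped, t, n_out, seed=0):
    """Naive stand-in for the point fusion module: concatenate the
    two warped (N, 3) clouds and sample n_out points, weighting the
    temporally closer frame's points more heavily."""
    merged = np.vstack([p0_warped, p1_warped])
    weights = np.concatenate([
        np.full(len(p0_warped), 1.0 - t),  # P0-side points favored for small t
        np.full(len(p1_warped), t),        # P1-side points favored for large t
    ])
    weights /= weights.sum()
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(merged), size=n_out, replace=False, p=weights)
    return merged[idx]
```

A real implementation would learn these weights per point rather than fixing them per frame.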
4. The multi-sensor-based unmanned ship sea surface data acquisition time synchronization method according to claim 3, wherein in step S12 the positions of the P̂0→t and P̂1→t frame point clouds are:
P̂0→t = P0 + F0→t
P̂1→t = P1 + F1→t
F0→t = t × F0→1
F1→t = (1 − t) × F1→0
where F0→1 denotes the point motion scene flow of the point cloud from the P0 frame to the P1 frame, and F1→0 denotes the point motion scene flow of the point cloud from the P1 frame to the P0 frame.
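The claim-4 warping relations reduce to two vectorized additions. The sketch below assumes the bidirectional scene flows F0→1 and F1→0 have already been produced by the network's flow estimator and are given as arrays:

```python
import numpy as np

def warp_point_clouds(p0, p1, f01, f10, t):
    """Warp two consecutive point clouds toward intermediate time t
    (0 < t < 1) using the claim-4 relations:
        F0->t = t * F0->1,       P0->t = P0 + F0->t
        F1->t = (1 - t) * F1->0, P1->t = P1 + F1->t
    p0, p1: (N, 3) point clouds; f01, f10: (N, 3) scene flows."""
    p0_t = p0 + t * f01
    p1_t = p1 + (1.0 - t) * f10
    return p0_t, p1_t
```

Both warped clouds estimate the same intermediate scene, which is why the fusion step of claim 3 can merge them into a single cloud Pt.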
5. The multi-sensor-based unmanned ship sea surface data acquisition time synchronization method according to claim 4, wherein the sensors comprise a photoelectric pod, a laser radar, a marine radar, a millimeter wave radar and a GPS sensor.
6. The multi-sensor-based unmanned ship sea surface data acquisition time synchronization method according to claim 5, wherein in step S1 the sampling frequencies of the photoelectric pod, the laser radar, the millimeter wave radar and the GPS sensor are adjusted to the fixed sampling frequency; the sampling frequency of the marine radar is lower than the fixed sampling frequency, so frame interpolation is performed on the marine radar's sampled data.
7. The multi-sensor-based unmanned ship sea surface data acquisition time synchronization method according to claim 6, wherein the fixed sampling frequency is 10 Hz.
8. The multi-sensor-based unmanned ship sea surface data acquisition time synchronization method according to claim 7, wherein in step S2, after the GPS UTC timestamp is added to each frame of sampled data acquired by each sensor, visual analysis is further performed on the sampled data of each sensor to verify whether the sampling times of the data are consistent.
9. The multi-sensor-based unmanned ship sea surface data acquisition time synchronization method according to claim 8, wherein the specific operation of performing visual analysis on the sampled data acquired by each sensor is as follows: projecting the sampled data acquired by the laser radar, the millimeter wave radar, the marine radar and the GPS sensor onto the image acquired by the photoelectric pod, and verifying whether the projected points of the sensor data on the image are consistent with the image information; if they are consistent, the data acquisition times of the sensors are synchronized.
10. The multi-sensor-based unmanned ship sea surface data acquisition time synchronization method according to claim 9, wherein the following projection formula is adopted when the sampled data acquired by the laser radar, the millimeter wave radar, the marine radar and the GPS sensor are projected onto the image acquired by the photoelectric pod:
P′ = K[R t]P
where P is the point cloud coordinate before projection, P′ is the coordinate in the image coordinate system after projection, K is the camera intrinsic matrix determined by the intrinsic parameters x0, y0, fx, fy and the skew s, and R and t are the extrinsic parameters of the projection.
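The claim-10 projection is the standard pinhole model. The sketch below assumes homogeneous pixel coordinates with a perspective divide; the intrinsic values (fx, fy, x0, y0) are placeholders, and the extrinsics R, t would come from sensor-to-camera calibration:

```python
import numpy as np

def project_points(points, K, R, t):
    """Project (N, 3) sensor-frame points into the camera image:
    transform with the extrinsics [R t], apply the intrinsic matrix
    K, then divide by depth to obtain (N, 2) pixel coordinates."""
    cam = points @ R.T + t           # sensor frame -> camera frame
    uvw = cam @ K.T                  # apply intrinsics (homogeneous)
    return uvw[:, :2] / uvw[:, 2:3]  # perspective divide

# Example intrinsic matrix from focal lengths fx, fy, skew s = 0,
# and principal point (x0, y0) -- placeholder values
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
```

Checking that these projected points land on the matching image features is the visual consistency test of claim 9.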
CN202310059293.XA 2023-01-17 2023-01-17 Multi-sensor-based unmanned ship sea surface data acquisition time synchronization method Pending CN116233782A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310059293.XA CN116233782A (en) 2023-01-17 2023-01-17 Multi-sensor-based unmanned ship sea surface data acquisition time synchronization method


Publications (1)

Publication Number Publication Date
CN116233782A true CN116233782A (en) 2023-06-06

Family

ID=86574263



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination