CN116184376A - Underwater three-dimensional terrain and multi-beam image sonar data simulation system and method - Google Patents
- Publication number: CN116184376A
- Application number: CN202211467131.1A
- Authority: CN (China)
- Legal status: Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/539—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4007—Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10044—Radar image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/30—Assessment of water resources
Abstract
The invention discloses a simulation system and method for underwater three-dimensional topography and multi-beam image sonar, belonging to the technical field of underwater detection and widely applicable to the fields of three-dimensional technology and sonar images. The data simulation unit simulates the navigation data of the underwater vehicle and the detection mode of the multi-beam image sonar; the first model construction unit mainly constructs a target simulation model and the underwater three-dimensional point cloud terrain; the second model construction unit mainly constructs a sonar image simulation model and the original simulated sonar image; and the third model construction unit mainly constructs a sonar image style model and the final sonar image. Together, these units ensure that the system can complete three-dimensional scene simulation and multi-beam image sonar data simulation in a changeable underwater environment and generate complete, clear sonar images.
Description
Technical Field
The invention relates to an underwater three-dimensional terrain and multi-beam image sonar data simulation system and method, and belongs to the technical field of underwater detection.
Background Art
In recent years, three-dimensional reconstruction of underwater scenes has played an important role in engineering projects such as marine resource development and underwater emergency rescue, and is gradually becoming a research focus in the field of underwater detection. Current underwater detection modes are mainly acoustic and optical. Although an optical image obtained by optical visual perception can cover rich colour and structural information within the detection range, dissolved and suspended substances in the water body absorb and scatter light strongly, so light is attenuated exponentially as it propagates underwater; in some turbid waters the visibility of optical imaging is less than ten metres, which severely limits the perception range. Acoustic visual perception based on sonar is widely used in the field of underwater detection because sound waves have longer wavelengths and propagate much farther under water.
Carrying a multi-beam sonar on an underwater vehicle is currently one of the most common modes of underwater detection. However, because underwater vehicles and sonars are expensive and the underwater environment is complex and changeable, underwater data are difficult to obtain, which severely limits progress in underwater exploration research. Simulation of underwater three-dimensional terrain and sonar images is therefore of great significance.
Disclosure of Invention
In order to solve the above technical problems, the invention provides an underwater three-dimensional terrain and multi-beam image sonar data simulation system and method, which can complete three-dimensional scene simulation and multi-beam image sonar data simulation in complex and changeable underwater environments and generate complete and clear sonar images.
The technical scheme for realizing the invention is as follows:
an underwater three-dimensional terrain and multibeam image sonar data simulation system, comprising: the system comprises an acquisition unit, a first model construction unit, a second model construction unit, a third model construction unit and a data simulation unit;
the acquisition unit is used for acquiring a high-definition two-dimensional sonar image of the underwater topography with the synthetic aperture sonar in the acquisition unit, converting the two-dimensional sonar image into a gray level image, and acquiring an actually measured multi-beam sonar image data set of the underwater topography with the multi-beam sonar in the acquisition unit;
the first model construction unit is used for constructing underwater three-dimensional point cloud terrains based on the two-dimensional sonar images;
the data simulation unit simulates navigation data of the underwater vehicle and a detection mode of the simulated multibeam image sonar to obtain a simulation result;
the second model construction unit is used for constructing a sonar image simulation model based on the simulation result and the underwater three-dimensional point cloud terrain to obtain an original simulated sonar image;
and the third model construction unit is used for constructing a sonar image style model based on the underwater topography actual measurement multi-beam sonar image dataset, and bringing the original simulation sonar image into the sonar image style model to construct a final sonar image.
Further, the specific method for obtaining the original simulated sonar image comprises the following steps:
obtaining the underwater motion trajectory of the underwater vehicle from the simulation, and applying the simulated multi-beam image sonar detection mode along that trajectory; performing a three-dimensional transformation according to the simulated orientation angle, finding the nearest neighbour point between each detection ray of the simulated multi-beam image sonar and the simulated three-dimensional point cloud terrain, and obtaining the distance between the detection point of each ray and the simulated multi-beam image sonar; determining the width of the simulated two-dimensional multi-beam image sonar image from the transverse detection opening angle of the simulated multi-beam image sonar, and determining its height and the distance represented by one pixel in a column of pixels from the detection range of the simulated multi-beam image sonar; and mapping the distance between each simulated detected target point and the simulated multi-beam image sonar into the two-dimensional image to generate the original simulated sonar image.
Further, the detection mode of the simulated multi-beam image sonar comprises the following specific steps:
determining the longitudinal and transverse opening angles of the simulated multi-beam sonar, and, taking the simulated multi-beam sonar as the origin, emitting a ray every 1° across the transverse opening angle and a ray every 0.1° across the longitudinal opening angle.
Further, the navigation data includes: timestamp data, navigation azimuth data, attitude angle data of the underwater vehicle, depth data of the underwater vehicle, and navigation speed data.
An underwater three-dimensional terrain and multi-beam image sonar data simulation method comprises the following steps:
step 1, the synthetic aperture sonar in the acquisition unit acquires a high-definition two-dimensional sonar image of the underwater topography, and the acquisition unit converts it into a first gray level image;
step 2, the first model building unit builds a target simulation model according to the KLSG data set and the first gray level image, and obtains an underwater two-dimensional terrain image;
step 3, a first model construction unit constructs underwater three-dimensional point cloud terrain according to the underwater two-dimensional terrain image;
step 4, simulating navigation data of the underwater vehicle and a multibeam image sonar detection mode by a data simulation unit to obtain a simulation result;
step 5, a second model construction unit combines the simulation result and the underwater three-dimensional point cloud terrain to construct a sonar image simulation model, so as to obtain an original simulated sonar image;
and 6, acquiring a group of underwater topography actual measurement multi-beam sonar image data sets by the multi-beam sonar in the acquisition unit, constructing a sonar image style model by the third model construction unit according to the underwater topography actual measurement multi-beam sonar image data sets, and bringing the original simulation sonar image into the sonar image style model to obtain a final sonar image.
Further, the specific method by which the first model construction unit establishes a target simulation model according to the KLSG data set and the first gray image and obtains the underwater two-dimensional terrain image is as follows:
randomly selecting an image from the KLSG data set as the image to be processed and converting it into gray space as a second gray image; performing image segmentation on the second gray image and binarizing it to obtain a binarized foreground image; combining the binarized foreground image, used as a mask, with the image to be processed to obtain a foreground image of the target area; feeding the binarized foreground image into the Canny algorithm to obtain the edge information of the binarized foreground image; brightening the foreground image and adding it at a random position in the first gray image to obtain an initial fused image; and performing Gaussian filtering along the image edges in the initial fused image according to the edge information of the binarized foreground image to generate the final two-dimensional underwater topography image.
Further, the method for constructing the underwater three-dimensional point cloud terrain according to the underwater two-dimensional terrain image comprises the following steps:
according to the simulation demand, setting the number of point clouds corresponding to the image pixel points, respectively using the abscissa and the ordinate of the pixel points of the image to correspond to the abscissa and the ordinate of the three-dimensional point cloud, extracting the gray value of the two-dimensional topographic image as the height of the three-dimensional point cloud, and establishing the underwater three-dimensional point cloud topography, wherein the specific mapping relation is shown in the following formula:
wherein: x is X map Is the abscissa of the three-dimensional point cloud; y is Y map Is the ordinate of the three-dimensional point cloud; z is Z map The height of the three-dimensional point cloud; x is X img Is the abscissa of the image, Y img As the ordinate of the image,gray values corresponding to x and y coordinates in the image; n and m are adjustment coefficients of three-dimensional terrain;
after generating an initial three-dimensional point cloud, interpolation is carried out between adjacent point clouds by using a linear interpolation method, wherein the interpolation is specifically shown as the following formula:
where x_j is the abscissa of the j-th point cloud, y_j its ordinate and z_j its height, and l is the number of points interpolated between adjacent point clouds.
Further, the multi-beam sonar in the acquisition unit acquires a group of underwater topography actual measurement multi-beam sonar image data sets, and the third model construction unit constructs a sonar image style model according to the underwater topography actual measurement multi-beam sonar image data sets, and obtains a final sonar image, and the specific method is as follows:
image segmentation is carried out on a group of images in the underwater topography actual measurement multi-beam sonar image dataset acquired by the multi-beam sonar in the acquisition unit, an image foreground and an image background are separated, an image background dataset is generated, and a sonar image style model is constructed according to the image background dataset; carrying the image background data set into a sonar image style model for training; and carrying out Gaussian blur on the simulated original sonar image, and carrying the Gaussian blurred original sonar image into a sonar image style model to generate a final simulated sonar image.
The beneficial effects are that:
1. In the invention, the acquisition unit, the first model construction unit, the second model construction unit, the third model construction unit and the data simulation unit work cooperatively. With only a single acquisition of the high-definition two-dimensional sonar image of the underwater topography by the synthetic aperture sonar, the five units can complete multiple simulations and, combined with an image style transfer technique, improve the definition of the final sonar image, reducing the high cost of acquiring high-definition two-dimensional sonar images with the synthetic aperture sonar many times.
2. And when the second model construction unit constructs the original simulated sonar image, the simulation result of the data simulation unit is combined, so that the accuracy of the original simulated sonar image is ensured.
3. The simulated multi-beam sonar uses the longitudinal and transverse opening angles of a real multi-beam sonar and, taking the simulated multi-beam sonar as the origin, emits a ray every 1° across the transverse opening angle and a ray every 0.1° across the longitudinal opening angle, ensuring the definition and accuracy of subsequent image generation.
4. According to the simulation demand, the number of point clouds corresponding to the image pixel points is set, the abscissa and the ordinate of the pixel points of the image correspond to the abscissa and the ordinate of the three-dimensional point cloud respectively, the gray value of the two-dimensional topographic image is extracted as the height of the three-dimensional point cloud, the underwater three-dimensional point cloud topography is built, the precision of converting the two-dimensional topographic image into the three-dimensional point cloud topography is improved, and the definition and accuracy of later imaging are ensured.
Drawings
FIG. 1 is a flow chart of a method for simulating underwater three-dimensional terrain and multi-beam image sonar data;
FIG. 2 is a schematic view of a simulated three-dimensional point cloud topography;
FIG. 3 is a schematic diagram of a simulated multibeam image sonar detection process in a simulated three-dimensional terrain;
fig. 4 is a simulated multibeam sonar image.
Detailed Description
The invention will be further described with reference to the drawings and specific examples.
The embodiment provides an underwater three-dimensional terrain and multi-beam image sonar data simulation system and method (as shown in figure 1) by combining actual conditions, wherein the simulation system comprises the following contents:
an underwater three-dimensional terrain and multibeam image sonar data simulation system, comprising: the system comprises an acquisition unit, a first model construction unit, a second model construction unit, a third model construction unit and a data simulation unit; wherein each unit functions as follows:
the acquisition unit is used for acquiring a high-definition two-dimensional sonar image of the underwater topography with the synthetic aperture sonar in the acquisition unit and converting it into a gray level image, and for acquiring an actually measured multi-beam sonar image data set with the multi-beam sonar in the acquisition unit;
the first model building unit is used for building a target simulation model and building underwater three-dimensional point cloud terrain;
the second model building unit builds a sonar image simulation model;
the third model construction unit is used for constructing a sonar image style model;
and the data simulation unit simulates navigation data of the underwater vehicle and simulates a detection mode of the multi-beam image sonar.
An underwater three-dimensional terrain and multi-beam image sonar data simulation method comprises the following specific steps:
step 1, the synthetic aperture sonar in the acquisition unit acquires a high-definition two-dimensional sonar image of the underwater topography, which the acquisition unit converts into a first gray level image;
it should be noted that the acquired two-dimensional sonar image of the underwater topography is generally a synthetic aperture sonar detection image. The acquired synthetic aperture sonar image is converted into gray space by the following formula:
Gray = 0.299*R + 0.587*G + 0.114*B (1)
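The gray conversion of formula (1) uses the standard BT.601 luminance weights and can be sketched in a few lines; the function name `to_gray` and the sample image are illustrative, not part of the patent:

```python
import numpy as np

def to_gray(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image to grayscale with BT.601 weights."""
    weights = np.array([0.299, 0.587, 0.114])
    return rgb[..., :3] @ weights

# pure-white image stays at full intensity because the weights sum to 1
img = np.full((2, 2, 3), 255.0)
gray = to_gray(img)
```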
step 2, a first model building unit builds a target simulation model according to the KLSG data set and the first gray level image, and obtains an underwater two-dimensional terrain image;
First, an image is randomly selected from the KLSG image data set as the target image; the image is segmented with the K-Means algorithm and binarized, and, according to the image characteristics of the data set, a small-area foreground region is selected as the target (a ship or an aircraft).
The binarized image is then processed in two ways: on the one hand, it is used as a mask to extract the target from the target image; on the other hand, it is fed into the Canny algorithm to obtain the edge information of the target image.
The target image is randomly superimposed into the first gray image according to its size, and Gaussian filtering is applied to the target edges of the superimposed image according to the edge information of the target image, finally generating the simulated two-dimensional topographic image; the filter kernel size is 3 x 3, blurring the boundary between the target edge and the background.
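The paste-and-blur fusion described above can be sketched as follows. This is a minimal NumPy illustration: the function name `paste_target`, the brightening gain of 1.2, and the use of a 3 x 3 box blur standing in for the Gaussian filtering are all assumptions made for the example.

```python
import numpy as np

def paste_target(background, target, mask, top, left, gain=1.2):
    """Brighten a segmented target, paste it into a gray background at
    (top, left) using its binary mask, then soften the mask-edge pixels
    with a 3x3 box blur (stand-in for the patent's Gaussian filtering)."""
    out = background.astype(float).copy()
    h, w = target.shape
    roi = out[top:top + h, left:left + w]
    m = mask > 0
    roi[m] = np.clip(gain * target[m], 0, 255)

    # edge pixels: inside the mask but with at least one non-mask 4-neighbour
    pad = np.pad(m, 1)
    interior = (pad[1:-1, 1:-1] & pad[:-2, 1:-1] & pad[2:, 1:-1]
                & pad[1:-1, :-2] & pad[1:-1, 2:])
    edge = m & ~interior

    # 3x3 box blur of the pasted region
    padded = np.pad(roi, 1, mode="edge")
    blurred = sum(padded[dy:dy + h, dx:dx + w]
                  for dy in range(3) for dx in range(3)) / 9.0
    roi[edge] = blurred[edge]
    return out

bg = np.zeros((10, 10))
tgt = np.full((4, 4), 100.0)
msk = np.ones((4, 4), dtype=int)
fused = paste_target(bg, tgt, msk, 3, 3)
```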
step 3, a first model building unit builds underwater three-dimensional point cloud terrains (as shown in figure 2) according to the underwater two-dimensional terrains;
the specific method for establishing the underwater three-dimensional point cloud terrain by using the simulated two-dimensional terrain image comprises the following steps: mapping the abscissa of the image into a three-dimensional space, and mapping the gray value of the image into the height of a three-dimensional point cloud, wherein the specific mapping relation is shown in a formula (2):
wherein: x is X map Is the abscissa of the three-dimensional point cloud; y is Y map Is the ordinate of the three-dimensional point cloud; z is Z map The height of the three-dimensional point cloud; x is X img Is the abscissa of the image, Y img As the ordinate of the image,gray values corresponding to x and y coordinates in the image; n and m are adjustment coefficients of three-dimensional terrain, in the example, the value of n is 100, the value of m is 0.01, and the specific value is according to the actual simulation requirement;
after the initial three-dimensional point cloud is generated, interpolation is performed between adjacent point clouds by using a linear interpolation method, in this example, the interpolation number of the adjacent point clouds is 99, which is specifically shown in formula (3):
where x_j is the abscissa of the j-th point cloud, y_j its ordinate and z_j its height.
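The pixel-to-point-cloud mapping and the linear interpolation between adjacent point clouds can be sketched as follows; the helper names and the row-wise interpolation are illustrative assumptions, with m = 0.01 and l = 99 taken from this embodiment:

```python
import numpy as np

def image_to_point_cloud(gray_img: np.ndarray, m: float = 0.01):
    """Map each pixel (x, y) of a 2-D terrain image to a 3-D point,
    using the gray value scaled by m as the height."""
    h, w = gray_img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return np.column_stack([xs.ravel(), ys.ravel(), m * gray_img.ravel()])

def densify_row(heights: np.ndarray, l: int = 99):
    """Linearly interpolate l extra samples between each pair of
    adjacent heights along one row of the terrain."""
    n = len(heights)
    fine_x = np.linspace(0, n - 1, (n - 1) * (l + 1) + 1)
    return np.interp(fine_x, np.arange(n), heights)

gray = np.array([[0.0, 100.0], [200.0, 255.0]])
pts = image_to_point_cloud(gray)
dense = densify_row(np.array([0.0, 1.0]), l=99)
```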
Step 4, simulating navigation data of the underwater vehicle and a multibeam image sonar detection mode by a data simulation unit to obtain a simulation result;
In the present embodiment, the navigation data includes timestamp data T, navigation azimuth data Cog, attitude angle data of the underwater vehicle (Heading, Pitch, Roll), the depth data set Depth of the underwater vehicle, and navigation speed data v;
The multi-beam image sonar detection mode is as follows: the longitudinal and transverse opening angles of the simulated multi-beam sonar are determined (in this embodiment the transverse opening angle is set to 80 degrees and the longitudinal opening angle to 1 degree), which fixes the three-dimensional space detected by the multi-beam sonar; taking the simulated multi-beam sonar as the origin, a ray is emitted every 0.15 degree across the transverse opening angle and every 0.1 degree across the longitudinal opening angle.
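The fan of detection rays described above can be generated as a grid of unit direction vectors; the axis convention (x across-track, y along the sonar axis, z vertical) and the function name are assumptions of this sketch:

```python
import numpy as np

def sonar_ray_directions(h_fov=80.0, v_fov=1.0, h_step=0.15, v_step=0.1):
    """Unit direction vectors for the simulated multi-beam sonar: one ray
    every h_step degrees across the transverse opening angle and every
    v_step degrees across the longitudinal one, centred on the sonar axis."""
    az = np.deg2rad(np.arange(-h_fov / 2, h_fov / 2 + 1e-9, h_step))
    el = np.deg2rad(np.arange(-v_fov / 2, v_fov / 2 + 1e-9, v_step))
    az_g, el_g = np.meshgrid(az, el)
    dirs = np.stack([np.cos(el_g) * np.sin(az_g),   # x: across-track
                     np.cos(el_g) * np.cos(az_g),   # y: along the sonar axis
                     np.sin(el_g)], axis=-1)        # z: vertical
    return dirs.reshape(-1, 3)

rays = sonar_ray_directions()
```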
Step 5, a second model construction unit combines the simulation result and the underwater three-dimensional point cloud terrain to construct a sonar image simulation model, so as to obtain an original simulated sonar image;
according to the simulation, a motion track of the underwater vehicle under the water is obtained, and a simulation multi-beam image sonar detection mode is brought into the motion track, wherein a schematic diagram of a detection process of the simulation multi-beam image sonar in a simulated three-dimensional terrain is shown in fig. 3;
The simulation result is preprocessed. The timestamp data T is the recording time of each data item; from T, the set T_1 of time intervals between successive data samples can be calculated, as shown in formulas (4) and (5):
T = {t_1, t_2, t_3, …, t_{n-2}, t_{n-1}, t_n} (4)
T_1 = {t_2 - t_1, t_3 - t_2, …, t_{n-1} - t_{n-2}, t_n - t_{n-1}} (5)
The navigation speed data V is the simulated instantaneous speed; the set V_1 of average speeds over the data acquisition intervals can be approximated from V, as shown in formulas (6) and (7):
V = {v_1, v_2, v_3, …, v_{n-2}, v_{n-1}, v_n} (6)
V_1 = {(v_1 + v_2)/2, (v_2 + v_3)/2, …, (v_{n-1} + v_n)/2} (7)
From the depth data set Depth, the set Depth_1 of differences between each depth and the previous one can be calculated, as shown in formulas (8) and (9):
Depth = {d_1, d_2, d_3, …, d_{n-2}, d_{n-1}, d_n} (8)
Depth_1 = {d_2 - d_1, d_3 - d_2, …, d_{n-1} - d_{n-2}, d_n - d_{n-1}} (9)
Using the time interval data set T_1 and the interval average speed set V_1, the movement distance data set D of the vehicle within each time interval can be calculated, as shown in formula (10):
D = {T_{1,i} · V_{1,i}, i = 1, 2, …, n-1} (10)
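The preprocessing of formulas (4) to (10) amounts to differencing the timestamps, averaging consecutive speeds, and multiplying element-wise; a minimal sketch (the function name is illustrative):

```python
import numpy as np

def leg_distances(timestamps, speeds):
    """Distance travelled in each recording interval: the interval
    lengths T_1 from the timestamps times the interval-average speeds
    V_1 approximated from consecutive instantaneous speeds."""
    t = np.asarray(timestamps, dtype=float)
    v = np.asarray(speeds, dtype=float)
    t1 = np.diff(t)                  # T_1 = {t_2 - t_1, ...}
    v1 = (v[:-1] + v[1:]) / 2.0      # V_1 = {(v_1 + v_2)/2, ...}
    return t1 * v1                   # D, one distance per interval

d = leg_distances([0.0, 1.0, 3.0], [2.0, 4.0, 4.0])
```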
establishing a rotation matrix R according to the carrier attitude angle data, wherein the rotation matrix is shown in a formula (11):
where h, p and r represent elements of the Heading, Pitch and Roll data sets, respectively.
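Formula (11) is not reproduced here, so the sketch below builds R under the common Z-Y-X (yaw-pitch-roll) convention; the patent's own axis order may differ, and the function name is illustrative:

```python
import numpy as np

def rotation_matrix(h, p, r):
    """Carrier attitude rotation matrix from heading h, pitch p and
    roll r (radians), assuming the Z-Y-X (yaw-pitch-roll) convention."""
    Rz = np.array([[np.cos(h), -np.sin(h), 0.0],
                   [np.sin(h),  np.cos(h), 0.0],
                   [0.0, 0.0, 1.0]])
    Ry = np.array([[np.cos(p), 0.0, np.sin(p)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(p), 0.0, np.cos(p)]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(r), -np.sin(r)],
                   [0.0, np.sin(r),  np.cos(r)]])
    return Rz @ Ry @ Rx

R0 = rotation_matrix(0.0, 0.0, 0.0)            # identity: no rotation
R90 = rotation_matrix(np.pi / 2, 0.0, 0.0)     # 90-degree heading change
```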
A sonar image simulation model is constructed according to the above algorithm, and the simulated multi-beam sonar image detection mode is fed into the constructed sonar image simulation model. Note that, since a simulated ray does not necessarily intersect a point of the three-dimensional point cloud map exactly, in this embodiment the point of the three-dimensional point cloud map closest to each simulated sonar detection ray is taken as the target point cloud.
The distance between the target point cloud and the vehicle is recorded, and the sonar detection range and its mapping relation to the sonar image are set. In this embodiment, the simulated sonar detection range is 10 m and the simulated sonar image height is 1000 pixels. The original simulated sonar image is generated according to the ratio of the target-point-cloud-to-vehicle distance to the sonar detection range, and speckle noise is randomly added to the original simulated sonar image.
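The mapping from detection distance to image row implied by this embodiment (10 m of range spanning 1000 pixel rows) can be sketched as follows; the rounding and clamping behaviour are assumptions of the example:

```python
def range_to_row(dist: float, max_range: float = 10.0,
                 img_height: int = 1000) -> int:
    """Map the distance between target point cloud and vehicle to a row
    index of the simulated sonar image: max_range metres span img_height
    pixel rows, and out-of-range echoes are clamped to the last row."""
    row = int(round(dist / max_range * (img_height - 1)))
    return min(max(row, 0), img_height - 1)
```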
Step 6, acquiring a group of underwater topography actual measurement multi-beam sonar image data sets by the multi-beam sonar in the acquisition unit, and constructing a sonar image style model by the third model construction unit according to the underwater topography actual measurement multi-beam sonar image data sets to obtain a final sonar image;
In this embodiment, the acquired actually measured multi-beam sonar image data set of the underwater topography was obtained by a BlueView MB2250-W-DL two-dimensional imaging sonar detecting real underwater topography;
in this embodiment, the sonar image style model is constructed as follows: the images in the measured data set (as shown in fig. 4) are segmented to separate the image foreground from the background; the foreground targets are removed to generate an image-background data set, from which the sonar image style model is built. This data set is fed, for training, into a sonar image style model based on a fast image style-transfer algorithm with offline model optimization; a single-style, single-transfer algorithm is used in this embodiment;
Gaussian blur is applied to the simulated original sonar image, and the blurred image is passed through the sonar image style model to generate the final simulated sonar image.
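The Gaussian pre-blur applied before style transfer could be sketched as a plain separable Gaussian in NumPy; the kernel radius and sigma below are illustrative choices, not values from the patent.

```python
import numpy as np

def gaussian_blur(img, sigma=1.5, radius=3):
    """Separable Gaussian blur: one horizontal pass, one vertical pass."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()                                  # normalize kernel weights
    h, w = img.shape
    pad = np.pad(img.astype(float), radius, mode="edge")
    # horizontal pass over the padded image
    tmp = sum(k[i] * pad[:, i:i + w] for i in range(2 * radius + 1))
    # vertical pass over the horizontally blurred result
    return sum(k[i] * tmp[i:i + h, :] for i in range(2 * radius + 1))
```

Because the kernel weights sum to one, a constant image passes through unchanged, while high-frequency detail (and synthetic noise) is smoothed before stylization.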
The step numbers in the above method embodiments are set for convenience of illustration, and the order of steps is not limited in any way, and the execution order of each step in the embodiments may be adaptively adjusted according to the understanding of those skilled in the art.
While the preferred embodiment of the present invention has been described in detail, the present invention is not limited to the embodiments described above, and various equivalent modifications and substitutions can be made by those skilled in the art without departing from the spirit of the present invention, and these equivalent modifications and substitutions are intended to be included in the scope of the present invention as defined in the appended claims.
Claims (8)
1. An underwater three-dimensional terrain and multi-beam image sonar data simulation system is characterized by comprising: the system comprises an acquisition unit, a first model construction unit, a second model construction unit, a third model construction unit and a data simulation unit;
the acquisition unit is used for acquiring a high-definition two-dimensional sonar image of the underwater topography by using the synthetic aperture sonar in the acquisition unit, converting the two-dimensional sonar image into a gray-scale image, and acquiring a measured multi-beam sonar image data set of the underwater topography by using the multi-beam sonar in the acquisition unit;
the first model construction unit is used for constructing underwater three-dimensional point cloud terrains based on the two-dimensional sonar images;
the data simulation unit simulates navigation data of the underwater vehicle and simulates a detection mode of the multi-beam image sonar;
the second model construction unit is used for constructing a sonar image simulation model based on the simulation result and the underwater three-dimensional point cloud terrain to obtain an original simulated sonar image;
and the third model construction unit is used for constructing a sonar image style model based on the underwater topography actual measurement multi-beam sonar image dataset, and bringing the original simulation sonar image into the sonar image style model to construct a final sonar image.
2. The system of claim 1, wherein the method for obtaining the original simulated sonar image comprises the following steps:
obtaining the underwater motion track of the underwater vehicle from the simulation, and applying the simulated multi-beam image sonar detection mode along the track; performing a three-dimensional transformation according to the simulated orientation angle, finding the nearest-neighbor point between each detection ray of the simulated multi-beam image sonar and the simulated three-dimensional point-cloud terrain, and obtaining the distance between the point detected by each ray and the simulated multi-beam image sonar; determining the width of the simulated two-dimensional multi-beam sonar image from the transverse detection opening angle of the simulated multi-beam image sonar, and determining the image height and the distance represented by one pixel in a column from the detection range of the simulated multi-beam image sonar; and mapping the distance between each simulated detected target point and the simulated multi-beam image sonar into the two-dimensional image to generate the initial simulated sonar image.
3. The system of claim 1, wherein the method for detecting the simulated multi-beam image sonar is as follows:
determining the longitudinal opening angle and the transverse opening angle of the simulated multi-beam sonar, and, taking the simulated multi-beam sonar as the origin, emitting a ray every 1° across the transverse opening angle and every 0.1° across the longitudinal opening angle.
4. The system of claim 1 or 2, wherein the voyage data comprises: timestamp data, voyage azimuth data, attitude angle data of the underwater vehicle, depth data of the underwater vehicle, and voyage speed data.
5. The underwater three-dimensional terrain and multi-beam image sonar data simulation method is characterized by comprising the following steps of:
step 1, a synthetic aperture sonar in an acquisition unit acquires an underwater topography high-definition two-dimensional sonar image and converts the underwater topography high-definition two-dimensional sonar image into a gray space to serve as a first gray image;
step 2, a first model building unit builds a target simulation model according to the KLSG data set and the first gray level image, and obtains an underwater two-dimensional terrain image;
step 3, a first model construction unit constructs underwater three-dimensional point cloud terrain according to the underwater two-dimensional terrain image;
step 4, simulating navigation data of the underwater vehicle and a multibeam image sonar detection mode by a data simulation unit to obtain a simulation result;
step 5, a second model construction unit combines the simulation result and the underwater three-dimensional point cloud terrain to construct a sonar image simulation model, so as to obtain an original simulated sonar image;
and 6, acquiring a group of underwater topography actual measurement multi-beam sonar image data sets by the multi-beam sonar in the acquisition unit, constructing a sonar image style model by the third model construction unit according to the underwater topography actual measurement multi-beam sonar image data sets, and bringing the original simulation sonar image into the sonar image style model to obtain a final sonar image.
6. The method of claim 5, wherein the first model building unit builds the target simulation model according to the KLSG dataset and the first gray scale image, and obtains the underwater two-dimensional terrain image by:
randomly selecting an image from the KLSG data set as the image to be processed and converting it into gray-scale space as a second gray image; performing image segmentation and binarization on the second gray image to obtain a binarized foreground image; combining the binarized foreground image, used as a mask, with the image to be processed to obtain the foreground image of the target area; applying the Canny algorithm to the binarized foreground image to obtain its edge information; brightening the foreground image and inserting it at a random position in the first gray image to obtain an initial fusion image; and performing Gaussian filtering in the initial fusion image along the image edges, guided by the edge information of the binarized foreground image, to generate the final two-dimensional underwater topography image.
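The mask-and-composite step of this claim could be sketched as follows. This is a minimal NumPy sketch under stated assumptions: a fixed threshold stands in for the unspecified segmentation method, and the Canny edge step and edge-guided filtering are omitted.

```python
import numpy as np

def composite_target(background, target, thresh=128, offset=(0, 0), gain=1.2):
    """Binarize the target image into a foreground mask, brighten the
    foreground, and paste it into the background at `offset`."""
    mask = target > thresh                              # binarized foreground mask
    fg = np.clip(target.astype(float) * gain, 0, 255)   # brightened foreground
    out = background.astype(float).copy()
    y, x = offset
    h, w = target.shape
    region = out[y:y + h, x:x + w]                      # view into the background
    region[mask] = fg[mask]                             # paste masked foreground
    return out.astype(np.uint8)
```

In the claimed pipeline the offset would be drawn at random, and a Gaussian filter would then be run along the pasted target's edges to blend the seam.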
7. The method of claim 5, wherein the constructing the underwater three-dimensional point cloud terrain from the underwater two-dimensional terrain image comprises:
setting the number of point-cloud points corresponding to the image pixels according to the simulation requirements; using the abscissa and ordinate of the image pixels as the abscissa and ordinate of the three-dimensional points, and extracting the gray value of the two-dimensional topographic image as the point height, the underwater three-dimensional point-cloud terrain is established; the specific mapping relation is shown in the following formula:
wherein: X_map is the abscissa of the three-dimensional point cloud; Y_map is the ordinate of the three-dimensional point cloud; Z_map is the height of the three-dimensional point cloud; X_img is the abscissa of the image, Y_img is the ordinate of the image, and the remaining term is the gray value corresponding to the x and y coordinates in the image; n and m are adjustment coefficients of the three-dimensional terrain;
after the initial three-dimensional point cloud is generated, interpolation is carried out between adjacent points by linear interpolation, as shown in the following formula:
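The pixel-to-point-cloud mapping and the linear interpolation between adjacent points could be sketched as below. Since the claim's formulas are not reproduced in this text, the exact form of the coefficients n and m and the row-densification factor are illustrative assumptions.

```python
import numpy as np

def image_to_point_cloud(gray, n=1.0, m=0.1):
    """Map each pixel (x, y, gray) to a 3-D point:
    X = n*x, Y = n*y, Z = m*gray (n, m: terrain adjustment coefficients)."""
    ys, xs = np.mgrid[0:gray.shape[0], 0:gray.shape[1]]
    return np.column_stack([n * xs.ravel(),
                            n * ys.ravel(),
                            m * gray.ravel().astype(float)])

def densify_rows(gray, factor=2):
    """Linearly interpolate extra rows between adjacent pixel rows,
    densifying the resulting point cloud."""
    h, w = gray.shape
    old = np.arange(h)
    new = np.linspace(0, h - 1, (h - 1) * factor + 1)
    return np.stack([np.interp(new, old, gray[:, c].astype(float))
                     for c in range(w)], axis=1)
```

Densifying first and then mapping yields a terrain whose point spacing can be matched to the simulated sonar's angular resolution.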
8. The method according to claim 5, wherein the multi-beam sonar in the obtaining unit obtains a set of underwater topography actual measurement multi-beam sonar image data sets, and the third model building unit builds a sonar image style model according to the underwater topography actual measurement multi-beam sonar image data sets, so as to obtain a final sonar image, and the specific method is that:
image segmentation is carried out on a group of images in the underwater topography actual measurement multi-beam sonar image dataset acquired by the multi-beam sonar in the acquisition unit, an image foreground and an image background are separated, an image background dataset is generated, and a sonar image style model is constructed according to the image background dataset; carrying the image background data set into a sonar image style model for training; and carrying out Gaussian blur on the simulated original sonar image, and carrying the Gaussian blurred original sonar image into a sonar image style model to generate a final simulated sonar image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211467131.1A CN116184376A (en) | 2022-11-22 | 2022-11-22 | Underwater three-dimensional terrain and multi-beam image sonar data simulation system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116184376A true CN116184376A (en) | 2023-05-30 |
Family
ID=86439080
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211467131.1A Pending CN116184376A (en) | 2022-11-22 | 2022-11-22 | Underwater three-dimensional terrain and multi-beam image sonar data simulation system and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116184376A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117452417A (en) * | 2023-12-14 | 2024-01-26 | 北京星天科技有限公司 | Cluster type underwater topography measurement system and method |
CN117452417B (en) * | 2023-12-14 | 2024-03-08 | 北京星天科技有限公司 | Cluster type underwater topography measurement system and method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110675418B (en) | Target track optimization method based on DS evidence theory | |
CN112257605B (en) | Three-dimensional target detection method, system and device based on self-labeling training sample | |
CN111241970A (en) | SAR image sea surface ship detection method based on yolov3 algorithm and sliding window strategy | |
CN110706177B (en) | Method and system for equalizing gray level of side-scan sonar image | |
Huang et al. | Comprehensive sample augmentation by fully considering SSS imaging mechanism and environment for shipwreck detection under zero real samples | |
CN114034288B (en) | Seabed microtopography laser line scanning three-dimensional detection method and system | |
CN111445395A (en) | Method for repairing middle area of side-scan sonar waterfall image based on deep learning | |
Wang et al. | A feature-supervised generative adversarial network for environmental monitoring during hazy days | |
CN109781073A (en) | A kind of shallow water depth Remotely sensed acquisition method merging wave feature and spectral signature | |
CN116184376A (en) | Underwater three-dimensional terrain and multi-beam image sonar data simulation system and method | |
CN111723632A (en) | Ship tracking method and system based on twin network | |
CN116027349A (en) | Coral reef substrate classification method based on laser radar and side scan sonar data fusion | |
CN114299318A (en) | Method and system for rapid point cloud data processing and target image matching | |
Almanza-Medina et al. | Deep learning architectures for navigation using forward looking sonar images | |
Karaki et al. | Multi-platforms and multi-sensors integrated survey for the submerged and emerged areas | |
Xie et al. | Towards differentiable rendering for sidescan sonar imagery | |
CN115223033A (en) | Synthetic aperture sonar image target classification method and system | |
Sethuraman et al. | Towards sim2real for shipwreck detection in side scan sonar imagery | |
CN114494603A (en) | Simulation sonar image data generation method based on Unity3D | |
Rosenblum et al. | 3D reconstruction of small underwater objects using high-resolution sonar data | |
CN111260674A (en) | Method, system and storage medium for extracting target contour line from sonar image | |
Zhang et al. | Bridge substructure feature extraction based on the underwater sonar point cloud data | |
CN116597085B (en) | Three-dimensional flow field reconstruction method, system, electronic equipment and storage medium | |
US20240185423A1 (en) | System and method of feature detection for airborne bathymetry | |
Ji | Multi-Resolution Inference of Bathymetry From Sidescan Sonar |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||