CN116989763A - Fusion positioning and mapping method for amphibious unmanned system - Google Patents

Fusion positioning and mapping method for amphibious unmanned system

Info

Publication number
CN116989763A
CN116989763A (application number CN202310520140.0A)
Authority
CN
China
Prior art keywords
data
fusion
gnss
factor
odometer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310520140.0A
Other languages
Chinese (zh)
Inventor
徐雍
易雪婷
陈梓杰
赵曜
刘畅
饶红霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN202310520140.0A priority Critical patent/CN116989763A/en
Publication of CN116989763A publication Critical patent/CN116989763A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3833Creation or updating of map data characterised by the source of data
    • G01C21/3841Data obtained from two or more sources, e.g. probe vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/203Specially adapted for sailing ships
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/45Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The application discloses a fusion positioning and mapping method for an amphibious unmanned system. The method comprises: acquiring GNSS data and performing satellite screening to obtain filtered GNSS data; acquiring visual-inertial data and tightly coupling it with the filtered GNSS data to obtain a global position; fusing the laser radar and the inertial measurement unit through an error-state Kalman filter to obtain a fused odometer; and acquiring the fused odometer data and performing two-stage factor graph optimization together with the GNSS data to obtain a three-dimensional point cloud map. With the application, the unmanned system can be positioned and mapped with high precision in an amphibious environment. The fusion positioning and mapping method for the amphibious unmanned system can be widely applied in the technical field of positioning and mapping.

Description

Fusion positioning and mapping method for amphibious unmanned system
Technical Field
The application relates to the technical field of positioning and mapping, in particular to a fusion positioning and mapping method for an amphibious unmanned system.
Background
The amphibious environment in which an amphibious unmanned system operates is a complex multi-medium terrain environment: it contains not only single land and water terrains but also a land-water junction zone. When operating in the junction zone, the unmanned system often faces problems such as shore occlusions degrading positioning accuracy and rough land-water switching. In an alternating land-water environment, using its onboard sensors to localize itself and map the surrounding environment in real time during motion is of great importance to whether the amphibious unmanned system can navigate autonomously in this complex environment. In recent years, researchers have continuously improved real-time localization and mapping technologies, but most of them target a single scene such as a land scene or a particular water-surface environment; how to achieve robust and efficient localization and mapping in a land-water switching, multi-medium terrain environment still leaves many technical gaps.
Therefore, a fusion positioning and mapping method for amphibious unmanned systems in amphibious environments needs to be studied.
Disclosure of Invention
In order to solve the technical problems, the application aims to provide a fusion positioning and mapping method for an amphibious unmanned system, which realizes high-precision positioning and mapping of the unmanned system in an amphibious environment.
The technical scheme adopted by the application is as follows: an amphibious unmanned system-oriented fusion positioning and mapping method comprises the following steps:
acquiring GNSS data and performing satellite screening to obtain filtered GNSS data;
acquiring visual inertial data, and performing tight coupling fusion on the visual inertial data and the filtered GNSS data to obtain global positioning information;
fusing the laser radar and the inertial measurement unit through an error state Kalman filter to obtain a fused odometer;
and performing closed loop detection by combining the global positioning information and the local positioning information of the fusion odometer, and performing two-stage factor graph optimization on the GNSS data and the fusion odometer to obtain a three-dimensional point cloud map.
In one possible implementation manner based on the technical solution, the method further includes:
and performing pose estimation of the unmanned system with a tightly coupled fusion method based on an iterated error-state Kalman filter on manifold.
In this implementation, addressing the problems that the sensors differ greatly from one another and that nonlinear factors affect system stability during optimization, this embodiment develops a tightly coupled fusion method based on an iterated error-state Kalman filter on manifold, realizing a lightweight, efficient, multi-source fused odometer that provides suitable map information and smooth pose information to the control system.
In one possible implementation manner based on the technical solution, the step of acquiring GNSS data and performing satellite screening to obtain filtered GNSS data specifically includes:
acquiring GNSS data and performing dimension reduction processing on the GNSS data to obtain preprocessed GNSS data;
acquiring an upward-looking (sky-facing) image and segmenting it to obtain a segmented upward-looking image;
and projecting the preprocessed GNSS data onto the segmented upward-looking image, and performing satellite visibility screening of the projected ephemeris with the segmented upward-looking image to obtain filtered GNSS data.
In this implementation, an adaptive satellite screening method is provided to address problems such as accuracy degradation and GNSS loss of lock caused by multipath effects on GNSS signals; satellite availability is screened using the upward-looking image, the ephemeris and the GNSS satellite signals.
In one possible implementation based on the technical solution, the step of acquiring an upward-looking image and segmenting it to obtain a segmented upward-looking image specifically includes:
acquiring an upward-looking image;
setting an initial global threshold and segmenting the upward-looking image based on the initial global threshold to obtain a preliminary segmented image;
calculating the mean gray levels of the two classes of the preliminary segmented image and taking their average as the new global threshold, iterating until the change in the global threshold is smaller than a preset minimum threshold, so as to obtain a final threshold;
and segmenting the upward-looking image into occluded and non-occluded regions based on the final threshold to obtain a segmented upward-looking image.
In this implementation, in order to segment the acquired upward-looking image, a global threshold segmentation method with an adaptive threshold is used to obtain the occluded and non-occluded regions in the upward view of the amphibious unmanned system.
In one possible implementation manner based on the technical solution, the step of obtaining visual inertial data and performing tight coupling fusion on the visual inertial data and the filtered GNSS data to obtain global positioning information specifically includes:
acquiring measurement data of an inertial measurement unit and performing pre-integration to obtain a pre-integration result;
acquiring shooting data of a camera and reconstructing pure visual motion to obtain a reconstruction result;
performing vision-inertia alignment and initialization based on the pre-integration result and the reconstruction result to obtain vision inertia data;
and inputting the aligned data into a sliding window optimizer for factor graph optimization to obtain a globally consistent trajectory as the global positioning information.
In the implementation mode, the measurement data of the camera and the inertial measurement unit are fully utilized, and the accuracy of system positioning is improved.
In one possible implementation manner based on the technical solution, the expression of the sliding window optimizer is as follows:
$$\hat{\mathcal{X}}_k=\arg\min_{\mathcal{X}_k}\Big\{\big\|r_0(x_0)\big\|^2_{\Sigma_0}+\sum_{(i,j)\in\mathcal{B}_k}\big\|r_{VIO}(x_i,x_j)\big\|^2_{\Sigma_{VIO}}+\sum_{i\in\mathcal{B}_k}\big\|r_{GNSS}(x_i)\big\|^2_{\Sigma_{GNSS}}\Big\}$$

wherein $\hat{\mathcal{X}}_k$ denotes the optimal estimate of the set $\mathcal{X}_k$, $\mathcal{B}_k$ denotes the set of all keyframes within the sliding window at time k, $r_{VIO}(x_i,x_j)$ denotes the relative constraint of the visual-inertial odometer between the i-th and j-th moments, $r_{GNSS}(x_i)$ denotes the relative constraint of the GNSS at the i-th moment, $r_0(x_0)$ denotes the prior constraint on the first state quantity in the sliding window, and $\Sigma_{VIO}$, $\Sigma_{GNSS}$, $\Sigma_0$ denote the covariance matrices of the respective constraints.
In this implementation, to address the mismatch between the orders of magnitude of GNSS and visual-inertial accuracy, the above formula is used to construct a sliding window optimizer, through which factor graph optimization is performed to obtain a highly globally consistent map.
In one possible implementation manner based on the technical scheme, the step of performing closed loop detection by combining global positioning information and local positioning information of the fusion odometer, and performing two-stage factor graph optimization on GNSS data and the fusion odometer to obtain a three-dimensional point cloud map specifically comprises the following steps:
acquiring a fusion odometer factor and a GNSS factor according to the fusion odometer data and the GNSS data correspondingly, and adding the fusion odometer factor and the GNSS factor to the factor graph to perform first-stage secondary factor graph optimization to obtain a primary map;
and performing closed loop detection by combining the global information and the local information, adding the closed loop detection to the primary map as constraint, resetting the factor graph, and performing secondary factor graph optimization in the second stage to obtain the three-dimensional point cloud map.
In the implementation mode, a two-stage optimization strategy capable of greatly reducing influence caused by GNSS outliers is designed aiming at the problem that a point cloud map is distorted and degenerated due to possible errors of GNSS signals.
In one possible implementation manner based on the technical scheme, the step of correspondingly acquiring the fusion odometer factor and the GNSS factor according to the fusion odometer data and the GNSS data, and adding the fusion odometer factor and the GNSS factor to the factor graph to perform first-stage secondary factor graph optimization to obtain a preliminary map specifically comprises the following steps:
correspondingly acquiring a fusion odometer factor and a GNSS factor according to the fusion odometer data and the GNSS data;
adding the fusion odometer factor and the GNSS factor to the factor graph, selecting a Cauchy robust kernel function to constrain the first optimization, and performing chi-square detection to obtain a first optimization result;
and deleting the Cauchy robust kernel function, and then performing the second factor graph optimization of the first stage on the first optimization result to obtain a preliminary map.
In this implementation, in order to remove outliers to the greatest extent, a Cauchy robust kernel function is selected to constrain the first optimization of the first stage, and chi-square detection is performed.
In one possible implementation manner based on the technical solution, the step of introducing closed loop detection as a constraint specifically includes:
performing false closed loop detection on the candidate closed loops;
mapping laser radar data, millimeter wave radar data and sonar data in the fused odometer data into a 2D image, and performing closed-loop detection on the 2D image to obtain a rough position;
the resolution of the 2D image is progressively increased through a fixed factor, closed loop detection is carried out on a plurality of resolution images, and the rough position is corrected to obtain a correct closed loop;
the correct closed loop is added as a constraint to the second stage of quadratic factor graph optimization.
In the implementation mode, aiming at the problem of large accumulated error of long-time graph construction, the multi-resolution closed-loop detection method based on the position is provided, and compared with the traditional closed-loop detection algorithm based on the appearance, the method provided by the application is easier to realize and has good operation.
Beneficial effects of the method: the vision-satellite-inertial global positioning method can improve the precision and stability of global positioning and provide a reliable guarantee for autonomous navigation and map construction of the amphibious unmanned system in a multi-medium amphibious environment; the stable map construction method for alternating amphibious scenes can improve the global consistency and precision of the point cloud map.
Drawings
FIG. 1 is an overall frame diagram of the fusion positioning and mapping method for an amphibious unmanned system of the present application;
Figure 2 is a flow chart of the steps of a fusion positioning and mapping method for an amphibious unmanned system of the present application;
FIG. 3 is a block diagram of satellite feasibility screening according to an embodiment of the application;
FIG. 4 is a frame diagram of a close-coupled fusion in accordance with an embodiment of the present application;
FIG. 5 is a factor graph optimization schematic of an embodiment of the present application;
FIG. 6 is a frame diagram of a multi-source fusion odometer in accordance with an embodiment of the application;
FIG. 7 is a frame diagram of a multi-resolution cross-normal distributed point cloud registration method in accordance with an embodiment of the present application;
FIG. 8 is a block diagram of the stable mapping method for alternating amphibious scenes according to an embodiment of the application;
FIG. 9 is a schematic diagram of a multi-resolution closed-loop detection method according to an embodiment of the present application.
Detailed Description
The application will now be described in further detail with reference to the drawings and to specific examples. The step numbers in the following embodiments are set for convenience of illustration only, and the order between the steps is not limited in any way, and the execution order of the steps in the embodiments may be adaptively adjusted according to the understanding of those skilled in the art.
Aiming at complex amphibious environments, there is a great need to improve the accuracy, real-time performance and long-term operating stability of positioning and mapping for the amphibious robot. As shown in FIG. 1, the application improves the robustness and smoothness of positioning and mapping of the amphibious robot in the amphibious environment by designing a global positioning method based on vision-satellite-inertial navigation, a tightly coupled multi-resolution-map odometer based on ranging sensors, and a stable mapping method for alternating amphibious scenes.
As shown in fig. 2, the application provides a fusion positioning and mapping method for an amphibious unmanned system, which comprises the following steps:
S1, a global positioning method based on vision-satellite-inertial navigation;
Aiming at the problems that GNSS signals are weak or missing due to multipath effects, terrain occlusion and other factors in a multi-medium amphibious environment, and that accumulated errors from long-term operation cannot be eliminated, this embodiment provides a global positioning method based on vision-satellite-inertial navigation, ensuring the accuracy of global positioning of the amphibious unmanned system.
S1.1, referring to FIG. 3, satellite feasibility screening;
s1.1.1, acquiring GNSS data and performing dimension reduction processing on the GNSS data to obtain preprocessed GNSS data;
Extrinsic calibration and yaw alignment are performed on the three-dimensional satellite data acquired by the unmanned system; the calibrated three-dimensional satellite data are then reduced in dimension to match the segmented upward-looking images, with intrinsic calibration and distortion correction applied.
S1.1.2, acquiring an upward-looking image and segmenting it to obtain a segmented upward-looking image;
In order to segment the acquired upward-looking image, this embodiment uses a global threshold segmentation method with an adaptive threshold. After the upward-looking image is acquired, an initial value T (usually the mean gray level) is first selected as the global threshold and the image is segmented with this threshold. The mean gray levels $m_1$ and $m_2$ of the two segmented classes are then computed, and their average $T=\tfrac{1}{2}(m_1+m_2)$ is taken as the new global threshold. This is iterated until the change in T is smaller than a set minimum threshold dT, yielding the occluded and non-occluded regions in the upward view of the amphibious unmanned system.
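For illustration, a minimal Python sketch of this iterative global-threshold segmentation is given below; the function name, the stopping tolerance d_t, the iteration cap and the assumption of an 8-bit grayscale NumPy image are choices of the sketch, not details specified by the embodiment:

```python
import numpy as np

def segment_sky_image(gray: np.ndarray, d_t: float = 1.0) -> np.ndarray:
    """Iterative global thresholding of an upward-looking (sky-facing) image.

    Returns a boolean mask: True = non-occluded (open sky), False = occluded.
    """
    t = float(gray.mean())                      # initial threshold: mean gray level
    for _ in range(100):                        # iteration cap for safety
        low, high = gray[gray < t], gray[gray >= t]
        m1 = float(low.mean()) if low.size else 0.0
        m2 = float(high.mean()) if high.size else 255.0
        t_new = 0.5 * (m1 + m2)                 # new threshold = mean of the two class means
        if abs(t_new - t) < d_t:                # stop once the threshold change is below dT
            t = t_new
            break
        t = t_new
    return gray >= t                            # brighter pixels taken as open sky
```

A typical call would be `mask = segment_sky_image(img)`, with `mask` then used for the satellite visibility screening in the next step.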
S1.1.3, projecting the preprocessed GNSS data onto the segmented upward-looking image, and performing visibility screening of the projected ephemeris using the segmented upward-looking image to obtain filtered GNSS data.
The ephemeris is projected into the segmented upward-looking images using the extrinsic matrix between the onboard GNSS sensor and the upward-looking camera; the segmented upward-looking images are then used to screen the projected ephemeris for visibility, filtering out occluded satellites. This avoids GNSS degradation caused by occlusion-induced multipath effects and completes the separation of direct line-of-sight and non-line-of-sight satellite signals.
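The visibility screening itself can be read as projecting each satellite line of sight into the segmented sky mask and testing the pixel it lands on. The sketch below assumes an equidistant fisheye model centred on the zenith and azimuth/elevation inputs; the projection model, axis conventions and all names are assumptions of the sketch rather than details given by the embodiment:

```python
import numpy as np

def satellite_visible(mask: np.ndarray, az_deg: float, el_deg: float,
                      fov_deg: float = 180.0) -> bool:
    """Project one satellite (azimuth/elevation) into the segmented sky mask
    and report whether it falls on a non-occluded pixel.

    mask: boolean sky mask from the threshold segmentation (True = open sky).
    Assumes an equidistant fisheye projection centred on the zenith.
    """
    h, w = mask.shape
    cx, cy = w / 2.0, h / 2.0
    r_max = min(cx, cy)
    zenith = 90.0 - el_deg                      # zenith angle: 0 deg at the image centre
    r = r_max * zenith / (fov_deg / 2.0)
    az = np.deg2rad(az_deg)
    u = int(round(cx + r * np.sin(az)))         # east -> +x (assumed convention)
    v = int(round(cy - r * np.cos(az)))         # north -> -y (assumed convention)
    if not (0 <= u < w and 0 <= v < h):
        return False                            # satellite outside the camera field of view
    return bool(mask[v, u])
```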
S1.2, referring to FIG. 4, acquiring visual inertial data, and performing tight coupling fusion on the visual inertial data and the filtered GNSS data to obtain global positioning information;
the visual inertial data and the GNSS observation data are tightly coupled and fused under the frame of probability, on one hand, the camera image is processed by a pure visual motion reconstruction (Structure from motion, sfM for short) method and is subjected to visual-inertial alignment with a pre-integration result of an inertial measurement unit, and after initialization is completed by the visual-inertial alignment, three-dimensional satellite coordinates are subjected to yaw alignment so as to complete visibility analysis of satellites, and the process fully utilizes the measurement data of the camera and the inertial measurement unit, so that the accuracy of the system is improved. Meanwhile, after preprocessing, the camera data, the inertial measurement unit data and the filtered GNSS data are input into a sliding window optimizer together for factor graph optimization so as to obtain a map with high global consistency. Wherein the factor graph optimization is schematically shown in FIG. 5, wherein L 1 、L 2 Finger visionInverse depth of the sense feature points.
Aiming at the problem of unmatched precision orders of the GNSS and the visual inertial system, the sliding window optimizer is constructed by adopting the following formula:
$$\hat{\mathcal{X}}_k=\arg\min_{\mathcal{X}_k}\Big\{\big\|r_0(x_0)\big\|^2_{\Sigma_0}+\sum_{(i,j)\in\mathcal{B}_k}\big\|r_{VIO}(x_i,x_j)\big\|^2_{\Sigma_{VIO}}+\sum_{i\in\mathcal{B}_k}\big\|r_{GNSS}(x_i)\big\|^2_{\Sigma_{GNSS}}\Big\}$$

wherein $\hat{\mathcal{X}}_k$ denotes the optimal estimate of the set $\mathcal{X}_k$; $\mathcal{B}_k$ denotes the set of all keyframes within the sliding window at time k; $r_{VIO}(x_i,x_j)$ denotes the relative constraint of the visual-inertial odometer between the i-th and j-th moments; $r_{GNSS}(x_i)$ denotes the relative constraint of the GNSS at the i-th moment; $r_0(x_0)$ denotes the prior constraint on the first state quantity in the sliding window; and $\Sigma_{VIO}$, $\Sigma_{GNSS}$, $\Sigma_0$ denote the covariance matrices of the respective constraints.
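The objective above can be read as a sum of Mahalanobis-weighted residuals over the window. The sketch below merely evaluates that cost for given residuals and covariances (the names and data layout are illustrative assumptions); in practice the minimisation itself would be carried out by a nonlinear factor-graph solver:

```python
import numpy as np

def mahalanobis_sq(r: np.ndarray, cov: np.ndarray) -> float:
    """Squared Mahalanobis norm ||r||^2_Sigma = r^T Sigma^-1 r."""
    return float(r @ np.linalg.solve(cov, r))

def sliding_window_cost(vio_terms, gnss_terms, prior_term) -> float:
    """Cost of the GNSS / visual-inertial sliding window described above.

    vio_terms  : list of (r_ij, Sigma_VIO) relative VIO residuals in the window
    gnss_terms : list of (r_i, Sigma_GNSS) GNSS residuals in the window
    prior_term : (r_0, Sigma_0) prior residual on the first state
    """
    cost = sum(mahalanobis_sq(r, c) for r, c in vio_terms)
    cost += sum(mahalanobis_sq(r, c) for r, c in gnss_terms)
    cost += mahalanobis_sq(*prior_term)
    return cost
```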
S2, performing pose estimation of the unmanned system based on a tightly coupled fusion method using an iterated error-state Kalman filter on manifold;
Aiming at the problems that the sensors differ greatly from one another and that nonlinear factors affect system stability during optimization, this embodiment develops a tightly coupled fusion method based on an iterated error-state Kalman filter on manifold, realizing a lightweight, efficient, multi-source fused odometer. The iterated error-state Kalman filter on manifold works by predicting and updating with sensor data: the prediction is obtained by pre-integrating the inertial measurement unit, the observation is obtained by multi-resolution cross normal-distributions-transform point cloud registration among the three ranging sensors (laser radar, millimeter-wave radar and sonar), and the current state value is obtained by substituting the prediction, the observation and the constraints into the iterated error-state Kalman filter on manifold.
S2.1, rapidly acquiring a predicted value of the system through pre-integration by utilizing the high-frequency characteristic of the inertial measurement unit, and taking the predicted value as the prior probability of the system.
S2.2, referring to FIG. 7, multi-resolution cross normal distribution transformation point cloud registration is carried out among three ranging sensors of a laser radar, a millimeter wave radar and a sonar.
S2.2.1, the reference point cloud is divided into grid cells or voxels of specified sizes according to the actual situation. In this embodiment, voxels of different sizes are set for the laser radar, the millimeter-wave radar and the sonar according to the different accuracies of these sensors.
S2.2.2, assuming that the points of each smallest cell in the reference point cloud follow a Gaussian distribution, the mean vector μ and covariance matrix Σ of the point set belonging to the reference surface can be calculated as follows:

$$\mu=\frac{1}{m}\sum_{k=1}^{m}x_k,\qquad \Sigma=\frac{1}{m}\sum_{k=1}^{m}\big(x_k-\mu\big)\big(x_k-\mu\big)^{T}$$

where m is the number of data points in the grid cell, k is an index value with $k\in[1{:}m]$, and $x_k$ is the k-th position point among the m data points.
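A direct Python sketch of these per-voxel statistics (the array shapes and the 1/m normalisation are assumptions of the sketch) is:

```python
import numpy as np

def voxel_statistics(points: np.ndarray):
    """Mean vector and covariance matrix of the points in one NDT voxel.

    points: (m, 3) array of the m reference points falling in the voxel.
    """
    m = points.shape[0]
    mu = points.mean(axis=0)           # mu = (1/m) * sum_k x_k
    d = points - mu
    sigma = (d.T @ d) / m              # Sigma = (1/m) * sum_k (x_k - mu)(x_k - mu)^T
    return mu, sigma
```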
S2.2.3, when the current sensor is registered, the data frame $P_B$ of the current ranging sensor is projected into the world coordinate system by the pose transformation matrix $T$ and registered, by normal-distributions point cloud registration, against the map data corresponding to the other two ranging sensors. The projection formula is:

$$P_W = T\,P_B$$
s2.2.4, matching the point cloud of the current scanning with the reference normal distribution point cloud model constructed in the first step with the maximum possibility, and finding the optimal current scanning pose.
Let the point cloud of the current scan be $X=\{x_1,\dots,x_n\}$ and assume the surface distribution of the reference scan is a D-dimensional normal distribution with probability density function $f(x)$. The pose variable that maximizes the following likelihood function is the optimal pose T:

$$\Psi(T)=\prod_{k=1}^{n} f\big(\mathrm{fmw}(T,x_k)\big)$$

or equivalently, the pose variable T that minimizes the negative log-likelihood function:

$$-\log\Psi(T)=-\sum_{k=1}^{n}\log f\big(\mathrm{fmw}(T,x_k)\big)$$

where $\mathrm{fmw}(T,x_k)$ is the transformation function that spatially transforms the k-th position point $x_k$ by the pose T, $f(\mathrm{fmw}(T,x_k))$ is the probability density of the position point obtained after transforming $x_k$ by T, and $\log(f(\mathrm{fmw}(T,x_k)))$ is its logarithm.
S2.2.5, the smaller the value of $-\log\Psi(T)$, the greater the likelihood that the points of the set X lie on the scanned surface of the reference after being transformed into the reference coordinate system by T. The negative log-likelihood is therefore minimized with the Newton iterative algorithm, solving the pose increment $\Delta p$ from $H\,\Delta p=-g$ at each iteration (with H and g the Hessian and gradient of $-\log\Psi$), thereby obtaining the optimal transformation pose T that matches the current point cloud to the reference scan point cloud.
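To make the optimisation step concrete, the sketch below evaluates the negative log-likelihood for a reduced planar pose and applies one finite-difference Newton step. The planar parameterisation, the nearest-mean voxel lookup and all names are simplifications assumed here for illustration, not the full 6-DOF procedure of the embodiment:

```python
import numpy as np

def rot_z(yaw):
    """Planar rotation about z (reduced example; the full method uses 6-DOF poses)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def ndt_negative_log_likelihood(pose, scan, voxels):
    """-log likelihood of the current scan under the reference NDT model.

    pose   : (x, y, yaw) planar pose
    scan   : (n, 3) points of the current scan
    voxels : list of (mu, sigma_inv) Gaussian statistics per reference voxel
    """
    x, y, yaw = pose
    pts = scan @ rot_z(yaw).T + np.array([x, y, 0.0])   # fmw(T, x_k) for every point
    cost = 0.0
    for p in pts:
        mu, s_inv = min(voxels, key=lambda v: np.linalg.norm(p - v[0]))  # nearest voxel (illustrative)
        d = p - mu
        cost += 0.5 * d @ s_inv @ d             # -log Gaussian, up to an additive constant
    return cost

def newton_step(f, pose, eps=1e-4):
    """One damped Newton step with finite-difference gradient and Hessian."""
    pose = np.asarray(pose, dtype=float)
    n = pose.size
    g = np.zeros(n)
    h = np.zeros((n, n))
    for i in range(n):
        e_i = np.eye(n)[i] * eps
        g[i] = (f(pose + e_i) - f(pose - e_i)) / (2 * eps)
        for j in range(n):
            e_j = np.eye(n)[j] * eps
            h[i, j] = (f(pose + e_i + e_j) - f(pose + e_i - e_j)
                       - f(pose - e_i + e_j) + f(pose - e_i - e_j)) / (4 * eps ** 2)
    h += 1e-6 * np.eye(n)                       # damping for numerical safety
    return pose - np.linalg.solve(h, g)
```

Repeatedly applying `newton_step(lambda T: ndt_negative_log_likelihood(T, scan, voxels), pose)` until the pose change is small mimics the Newton iteration described above.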
Sensor degradation is fully taken into account by continuously monitoring the covariance. When the covariance exceeds a threshold, the global positioning status is checked: if global positioning has not degraded, a global constraint is added as an observation to reduce the covariance and maintain system accuracy; if global positioning has failed, fusion degradation handling is performed to keep the covariance from diverging. In that case the gyroscope of the inertial measurement unit is used for prediction and the accelerometer for observation, the accelerometer providing an absolute reference for the horizontal attitude, while a magnetometer is added to provide an absolute reference toward north, so that the attitude solution of an attitude and heading reference system (AHRS) is computed and the smoothness of the odometer is guaranteed at a minimum.
S2.3, a tightly coupled fusion method based on an iterated error-state Kalman filter on manifold.
The flow of this tightly coupled fusion method based on the iterated error-state Kalman filter is as follows:
First, the error state of the current state is formed:

$$\delta x=\big[\delta p,\ \delta v,\ \delta\theta,\ \delta b_a,\ \delta b_g,\ \delta g\big]^{T}$$

where $\delta p$, $\delta v$, $\delta\theta$, $\delta b_a$, $\delta b_g$, $\delta g$ denote the errors of the position vector, the velocity vector, the angle-axis vector, the bias vector of the inertial measurement unit accelerometer, the bias vector of the gyroscope, and the gravitational acceleration vector, respectively.
Then, a kalman filter is performed in the error state δx to estimate the trajectory of the error state.
Finally, the updated error state δx is added back to the nominal state to serve as the Kalman-filter-updated estimate, namely:

$$x=\bar{x}\oplus\delta x:\quad p=\bar{p}+\delta p,\ \ v=\bar{v}+\delta v,\ \ q=\bar{q}\otimes\exp(\delta\theta),\ \ b_a=\bar{b}_a+\delta b_a,\ \ b_g=\bar{b}_g+\delta b_g,\ \ g=\bar{g}+\delta g$$

where x denotes the current state, $\bar{x}$ the nominal state of x and δx the error state of x; $\delta p$, $\delta v$, $\delta\theta$, $\delta b_a$, $\delta b_g$, $\delta g$ are the components of the error state; $\oplus$ denotes generalized addition, $\otimes$ denotes the quaternion product, and $\exp(\delta\theta)$ maps the angle-axis vector onto a quaternion rotation.
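The generalized addition can be sketched directly; the quaternion convention (Hamilton, ordered [w, x, y, z]) and the dictionary-based state layout are assumptions of this sketch rather than details fixed by the embodiment:

```python
import numpy as np

def quat_mul(q1, q2):
    """Hamilton quaternion product, q = [w, x, y, z]."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def exp_quat(dtheta):
    """Map a small angle-axis vector to a unit quaternion, exp(delta_theta)."""
    angle = np.linalg.norm(dtheta)
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = dtheta / angle
    return np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))

def inject_error_state(nominal: dict, dx: dict) -> dict:
    """Generalized addition x = x_nominal [+] delta_x used in the ESKF update."""
    return {
        "p":   nominal["p"] + dx["dp"],
        "v":   nominal["v"] + dx["dv"],
        "q":   quat_mul(nominal["q"], exp_quat(dx["dtheta"])),  # quaternion product
        "b_a": nominal["b_a"] + dx["db_a"],
        "b_g": nominal["b_g"] + dx["db_g"],
        "g":   nominal["g"] + dx["dg"],
    }
```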
And in the updating process, the working points are continuously switched to carry out iterative updating. Compared with other state estimation technologies, the tightly coupled fusion method provided by the embodiment has the advantages of high precision, high efficiency and strong stability, overcomes the influence caused by nonlinear factors in a system motion equation and an observation equation, and provides an elegant Kalman filtering framework for a nonlinear amphibious unmanned system.
Aiming at the problem that, in an alternating amphibious environment, the narrow common coverage area between sensors and loosely coupled fusion methods cannot provide a highly consistent odometer, this embodiment designs a tightly coupled odometer framework with a multi-resolution map based on ranging sensors: the laser radar, the millimeter-wave radar, the sonar and the inertial sensor are tightly coupled and fused, and a multi-resolution map is built, as shown in FIG. 6.
S3, a stable mapping method for an amphibious alternating scene.
Aiming at the problem that the point cloud map is distorted and degenerated due to possible errors of GNSS signals, the embodiment designs a two-stage optimization strategy capable of greatly reducing the influence caused by GNSS outliers. The frame diagram is shown in fig. 8.
Because the system's computation of longitude and latitude depends to some extent on GNSS signals, erroneous information may be output even with a fixed solution. The information matrix of the GNSS is therefore used to characterize signal quality, and the error of the GNSS factor can be expressed by the following formula:

$$r_{gnss,k}=y_{gnss,k}\ominus x_k=\mathrm{Log}\big(y_{gnss,k}^{-1}\,x_k\big)$$

where $y_{gnss,k}\in SE(3)$ is the 6-degree-of-freedom pose constraint provided by the GNSS, $x_k\in SE(3)$ is the 6-degree-of-freedom pose at time k, the symbol $\ominus$ denotes subtraction on the manifold, and the Log operation is the logarithmic mapping from SE(3) to $\mathfrak{se}(3)$, converted to the vector space $\mathbb{R}^6$; SE(3) denotes the special Euclidean group and $\mathfrak{se}(3)$ its corresponding Lie algebra.
S3.1, fusing the laser radar and the inertial measurement unit through an error state Kalman filter to obtain a fused odometer;
and fusing the laser radar and the inertial measurement unit through an error state Kalman filter to form a fused odometer. The method comprises the following specific steps: firstly, carrying out short-time rapid estimation on an inertial measurement unit based on an error state Kalman filter to obtain self-estimation information of an unmanned system carrier; and then taking the self-estimation information as an initial value of the laser radar to estimate the pose of the laser radar, thereby realizing fusion.
The relative motion error produced by this fused odometer can be expressed by the following formula:

$$r_{odom,k,j}=y_{k,j}\ominus\big(x_j^{-1}x_k\big)=\mathrm{Log}\big(y_{k,j}^{-1}\,x_j^{-1}x_k\big)$$

where $r_{odom,k,j}$ is the relative motion error of the fused odometer between times k and j, $y_{k,j}$ is the relative measurement of the laser radar between times k and j, and $x_k$ and $x_j$ are the poses of the unmanned system at the k-th and j-th moments, respectively.
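Under the residual forms reconstructed above (which are themselves assumptions as to composition order), both the GNSS factor error and the relative-motion error reduce to an SE(3) logarithm of a pose composition. A sketch using homogeneous 4x4 matrices and SciPy's matrix logarithm is given below; the helper names are illustrative:

```python
import numpy as np
from scipy.linalg import logm

def se3_log(T: np.ndarray) -> np.ndarray:
    """SE(3) logarithm as a 6-vector [rho, phi] (translational and rotational parts)."""
    xi_hat = np.real(logm(T))                      # 4x4 element of se(3)
    rho = xi_hat[:3, 3]
    phi = np.array([xi_hat[2, 1], xi_hat[0, 2], xi_hat[1, 0]])
    return np.concatenate([rho, phi])

def odometry_residual(x_k: np.ndarray, x_j: np.ndarray, y_kj: np.ndarray) -> np.ndarray:
    """Relative-motion error between times k and j:
    r = Log( y_kj^-1 * (x_j^-1 * x_k) ), all poses as 4x4 homogeneous matrices."""
    return se3_log(np.linalg.inv(y_kj) @ np.linalg.inv(x_j) @ x_k)

def gnss_residual(x_k: np.ndarray, y_k: np.ndarray) -> np.ndarray:
    """GNSS factor error: manifold difference between the GNSS pose constraint
    y_k and the estimated pose x_k, r = Log( y_k^-1 * x_k )."""
    return se3_log(np.linalg.inv(y_k) @ x_k)
```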
And S3.2, correspondingly acquiring a fusion odometer factor and a GNSS factor according to the fusion odometer data and the GNSS data, and adding the fusion odometer factor and the GNSS factor to the factor graph to perform first-stage secondary factor graph optimization to obtain a primary map.
The factor graph comprises nodes and edges, wherein the nodes refer to variables to be optimized, the edges refer to factors, and constraint conditions can be formed on the nodes. The fusion odometer factor and the GNSS factor are obtained through GNSS data and fusion odometer data respectively, and can form constraint on nodes in the factor graph.
Optimization stage one:
After the fusion odometer factor and the GNSS factor are obtained, they are added into the factor graph to perform the first factor graph optimization of the first stage; meanwhile, in order to remove outliers to the greatest extent, a Cauchy robust kernel function is selected to constrain this first optimization and chi-square detection is performed.
The Cauchy robust kernel function ρ is defined as:

$$\rho(s)=c^{2}\log\!\Big(1+\frac{s}{c^{2}}\Big)$$

where c is the control parameter and s is the squared error term.
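A sketch of the Cauchy robust kernel and the chi-square gating used in this first optimization is given below; the kernel form matches the common definition $c^2\log(1+s/c^2)$ used in robust graph optimisation, and the gating threshold is left to the caller as an assumed parameter:

```python
import numpy as np

def cauchy_kernel(s: float, c: float) -> float:
    """Cauchy robust kernel rho(s) = c^2 * log(1 + s / c^2).

    s: squared error term of a factor; c: control parameter.
    """
    return c * c * np.log(1.0 + s / (c * c))

def chi2_gate(r: np.ndarray, cov: np.ndarray, threshold: float) -> bool:
    """Chi-square check on a residual: keep the factor if the squared
    Mahalanobis distance is below the gating threshold, else flag it as an outlier."""
    d2 = float(r @ np.linalg.solve(cov, r))
    return d2 < threshold
```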
After the outliers are removed, the Cauchy robust kernel function is removed and the second factor graph optimization of the first stage is carried out to obtain a coarse map.
Optimization stage two:
Loop-closure detection is carried out to obtain a loop-closure factor, and after the factor graph is reset, the factor graph optimization of the second stage is performed, yielding a map with higher accuracy and stronger robustness. The whole mapping process can be modeled as a batch maximum a posteriori estimation problem; the specific mathematical model is as follows:

$$\hat{\mathcal{X}}=\arg\min_{\mathcal{X}}\Big\{\sum_{k\in\mathcal{K}_{odom}}\big\|r_{odom}\big\|^{2}_{\Sigma_{odom}}+\sum_{k\in\mathcal{K}_{gnss}}\big\|r_{gnss}\big\|^{2}_{\Sigma_{gnss}}+\sum_{k\in\mathcal{K}_{loop}}\big\|r_{loop}\big\|^{2}_{\Sigma_{loop}}\Big\}$$

where $r_{odom}$, $r_{gnss}$, $r_{loop}$ are the odometer constraint residual, the satellite constraint residual and the loop-closure constraint residual, respectively; $\Sigma_{odom}$, $\Sigma_{gnss}$, $\Sigma_{loop}$ are the covariance matrices corresponding to these constraints; $\mathcal{K}_{odom}$, $\mathcal{K}_{loop}$, $\mathcal{K}_{gnss}$ are the keyframe sets of the odometer, loop-closure and satellite constraints, respectively; and $\mathcal{X}$ is the set of robot poses.
Aiming at the problem of large accumulated error in long-duration mapping, this embodiment provides a position-based multi-resolution closed-loop detection method. The position-based closed-loop detection algorithm only searches within a certain range (e.g., within 20 m) to prevent false closed loops. Since good GNSS measurements have already been obtained in the optimization stage, ensuring that the accumulated error of the system does not exceed the search range, the method proposed in this embodiment is easier to implement and runs better than conventional appearance-based closed-loop detection algorithms. The flow chart of the method is shown in FIG. 9.
The method for multi-resolution closed-loop detection based on the position provided by the embodiment comprises the following specific steps:
First, candidate closed loops within a threshold range are checked to prevent false closed loops. The current global positioning information is compared with the historical global positioning information: the false-closed-loop check accepts a candidate for further processing only if the difference is below a set threshold. Candidates whose difference exceeds the threshold are skipped, which saves considerable computational resources.
Then, closed-loop detection is performed with a multi-sensor, multi-resolution search method: the data of the laser radar, the millimeter-wave radar and the sonar are mapped into a 2D image, closed-loop detection is first performed on a low-resolution image to obtain a coarse position, the resolution of the image is then increased by a fixed factor, closed-loop detection is carried out on the images at several resolutions, and the resulting closed loop is added as a constraint to the second optimization stage. With closed-loop detection added, when the carrier moves into a region it has reached before, additional constraints can be provided for the pose graph and the global accuracy of the trajectory is improved; at the same time, objects in the common view can be further optimized, eliminating accumulated errors and reducing map ghosting.
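As a much-reduced sketch of the position gating and coarse-to-fine matching (translation only, with a mean-squared-error matching criterion, a 20 m default radius and all names assumed here for illustration; the embodiment itself does not fix these details):

```python
import numpy as np

def candidate_loops(cur_pos, history, radius=20.0):
    """Position-based gating: only poses within `radius` metres of the current
    global position are considered as loop-closure candidates."""
    return [i for i, p in enumerate(history)
            if np.linalg.norm(np.asarray(p) - np.asarray(cur_pos)) < radius]

def downsample(img: np.ndarray, factor: int) -> np.ndarray:
    """Block-average downsampling of a 2D image by an integer factor."""
    h, w = (img.shape[0] // factor) * factor, (img.shape[1] // factor) * factor
    return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def coarse_to_fine_offset(query: np.ndarray, ref: np.ndarray,
                          factors=(8, 4, 2, 1), search=4):
    """Estimate the 2D translation aligning `query` to `ref` by matching at
    progressively finer resolutions (rotation omitted in this sketch)."""
    offset = np.zeros(2, dtype=int)
    for f in factors:
        q, r = downsample(query, f), downsample(ref, f)
        best, best_err = offset // f, np.inf
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                cand = offset // f + np.array([dy, dx])
                shifted = np.roll(q, (int(cand[0]), int(cand[1])), axis=(0, 1))
                err = float(np.mean((shifted - r) ** 2))
                if err < best_err:
                    best, best_err = cand, err
        offset = best * f                  # refine this coarse estimate at the next level
    return offset
```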
Fusion positioning and mapping device for an amphibious unmanned system:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement a fused localization and mapping method for an amphibious unmanned system as described above.
The content in the method embodiment is applicable to the embodiment of the device, and the functions specifically realized by the embodiment of the device are the same as those of the method embodiment, and the obtained beneficial effects are the same as those of the method embodiment.
A storage medium having stored therein instructions executable by a processor, characterized by: the processor executable instructions when executed by the processor are for implementing a fused positioning and mapping method for an amphibious unmanned system as described above.
The content in the method embodiment is applicable to the storage medium embodiment, and functions specifically implemented by the storage medium embodiment are the same as those of the method embodiment, and the achieved beneficial effects are the same as those of the method embodiment.
While the preferred embodiment of the present application has been described in detail, the application is not limited to the embodiment, and various equivalent modifications and substitutions can be made by those skilled in the art without departing from the spirit of the application, and these equivalent modifications and substitutions are intended to be included in the scope of the present application as defined in the appended claims.

Claims (9)

1. The fusion positioning and mapping method for the amphibious unmanned system is characterized by comprising the following steps of:
acquiring GNSS data and performing satellite screening to obtain filtered GNSS data;
acquiring visual inertial data, and performing tight coupling fusion on the visual inertial data and the filtered GNSS data to obtain global positioning information;
fusing the laser radar and the inertial measurement unit through an error state Kalman filter to obtain a fused odometer;
and performing closed loop detection by combining the global positioning information and the local positioning information of the fusion odometer, and performing two-stage factor graph optimization on the GNSS data and the fusion odometer to obtain a three-dimensional point cloud map.
2. The amphibious unmanned system-oriented fusion positioning and mapping method of claim 1, further comprising:
and performing pose estimation on the unmanned system based on a tight coupling fusion method of iterative error state Kalman on manifold.
3. The method for fusion positioning and mapping of amphibious unmanned system according to claim 1, wherein the steps of obtaining GNSS data and performing satellite screening to obtain filtered GNSS data comprise:
acquiring GNSS data and performing dimension reduction processing on the GNSS data to obtain preprocessed GNSS data;
acquiring an upward-looking image and segmenting it to obtain a segmented upward-looking image;
and projecting the preprocessed GNSS data onto the segmented upward-looking image, and performing visibility screening of the projected ephemeris using the segmented upward-looking image to obtain filtered GNSS data.
4. The fusion positioning and mapping method for an amphibious unmanned system according to claim 3, wherein the step of acquiring an upward-looking image and segmenting it to obtain a segmented upward-looking image specifically comprises:
acquiring an upward-looking image;
setting an initial global threshold and segmenting the upward-looking image based on the initial global threshold to obtain a preliminary segmented image;
calculating the mean gray levels of the two classes of the preliminary segmented image and taking their average as the new global threshold, iterating until the change in the global threshold is smaller than a preset minimum threshold, so as to obtain a final threshold;
and segmenting the upward-looking image into occluded and non-occluded regions based on the final threshold to obtain a segmented upward-looking image.
5. The method for fusion positioning and mapping of amphibious unmanned system according to claim 1, wherein the steps of obtaining visual inertial data and performing tight coupling fusion on the visual inertial data and the filtered GNSS data to obtain global positioning information specifically comprise:
acquiring measurement data of an inertial measurement unit and performing pre-integration to obtain a pre-integration result;
acquiring shooting data of a camera and reconstructing pure visual motion to obtain a reconstruction result;
performing vision-inertia alignment and initialization based on the pre-integration result and the reconstruction result to obtain vision inertia data;
and inputting the aligned data into a sliding window optimizer for factor graph optimization to obtain a globally consistent trajectory as the global positioning information.
6. The fusion positioning and mapping method for an amphibious unmanned system according to claim 5, wherein the expression of the sliding window optimizer is as follows:
$$\hat{\mathcal{X}}_k=\arg\min_{\mathcal{X}_k}\Big\{\big\|r_0(x_0)\big\|^2_{\Sigma_0}+\sum_{(i,j)\in\mathcal{B}_k}\big\|r_{VIO}(x_i,x_j)\big\|^2_{\Sigma_{VIO}}+\sum_{i\in\mathcal{B}_k}\big\|r_{GNSS}(x_i)\big\|^2_{\Sigma_{GNSS}}\Big\}$$

wherein $\hat{\mathcal{X}}_k$ denotes the optimal estimate of the set $\mathcal{X}_k$, $\mathcal{B}_k$ denotes the set of all keyframes within the sliding window at time k, $r_{VIO}(x_i,x_j)$ denotes the relative constraint of the visual-inertial odometer between the i-th and j-th moments, $r_{GNSS}(x_i)$ denotes the relative constraint of the GNSS at the i-th moment, $r_0(x_0)$ denotes the prior constraint on the first state quantity in the sliding window, and $\Sigma_{VIO}$, $\Sigma_{GNSS}$, $\Sigma_0$ denote the covariance matrices of the respective constraints.
7. The method for fusion positioning and mapping of amphibious unmanned system according to claim 1, wherein the steps of performing closed loop detection by combining global positioning information and local positioning information of a fusion odometer, and performing two-stage factor map optimization on GNSS data and the fusion odometer to obtain a three-dimensional point cloud map specifically comprise:
acquiring a fusion odometer factor and a GNSS factor correspondingly according to the fusion odometer data and the GNSS data, and adding the fusion odometer factor and the GNSS factor to the factor graph to perform first-stage secondary factor graph optimization to obtain a primary map;
and performing closed loop detection by combining the global information and the local information, adding the closed loop detection to the primary map as constraint, resetting the factor graph, and performing secondary factor graph optimization in the second stage to obtain the three-dimensional point cloud map.
8. The method for fusion positioning and mapping of amphibious unmanned system according to claim 7, wherein the steps of obtaining the fusion odometer factor and the GNSS factor according to the correspondence of the fusion odometer data and the GNSS data, adding the fusion odometer factor and the GNSS factor to the factor graph for first-stage secondary factor graph optimization and obtaining the preliminary map specifically comprise the following steps:
correspondingly acquiring a fusion odometer factor and a GNSS factor according to the fusion odometer data and the GNSS data;
adding the fusion odometer factor and the GNSS factor to the factor graph, selecting a Cauchy robust kernel function to constrain the first optimization, and performing chi-square detection to obtain a first optimization result;
and deleting the Cauchy robust kernel function, and then performing the second factor graph optimization of the first stage on the first optimization result to obtain a preliminary map.
9. The method for fusion positioning and mapping of an amphibious unmanned system according to claim 7, wherein the step of closed loop detection is used as a constraint, comprises the following steps:
performing false closed loop detection on the candidate closed loops;
mapping laser radar data, millimeter wave radar data and sonar data in the fused odometer data into a 2D image, and performing closed-loop detection on the 2D image to obtain a rough position;
the resolution of the 2D image is progressively increased through a fixed factor, closed loop detection is carried out on a plurality of resolution images, and the rough position is corrected to obtain a correct closed loop;
the correct closed loop is added as a constraint to the second stage of quadratic factor graph optimization.
CN202310520140.0A 2023-05-10 2023-05-10 Fusion positioning and mapping method for amphibious unmanned system Pending CN116989763A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310520140.0A CN116989763A (en) 2023-05-10 2023-05-10 Fusion positioning and mapping method for amphibious unmanned system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310520140.0A CN116989763A (en) 2023-05-10 2023-05-10 Fusion positioning and mapping method for amphibious unmanned system

Publications (1)

Publication Number Publication Date
CN116989763A true CN116989763A (en) 2023-11-03

Family

ID=88532871

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310520140.0A Pending CN116989763A (en) 2023-05-10 2023-05-10 Fusion positioning and mapping method for amphibious unmanned system

Country Status (1)

Country Link
CN (1) CN116989763A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112304307A (en) * 2020-09-15 2021-02-02 浙江大华技术股份有限公司 Positioning method and device based on multi-sensor fusion and storage medium
CN113340295A (en) * 2021-06-16 2021-09-03 广东工业大学 Unmanned ship near-shore real-time positioning and mapping method with multiple ranging sensors
CN113837277A (en) * 2021-09-24 2021-12-24 东南大学 Multisource fusion SLAM system based on visual point-line feature optimization
US20230118945A1 (en) * 2021-11-02 2023-04-20 Apollo Intelligent Driving Technology (Beijing) Co., Ltd. Method of processing data for autonomous vehicle, electronic device, storage medium and autonomous vehicle
CN114608561A (en) * 2022-03-22 2022-06-10 中国矿业大学 Positioning and mapping method and system based on multi-sensor fusion
CN115479598A (en) * 2022-08-23 2022-12-16 长春工业大学 Positioning and mapping method based on multi-sensor fusion and tight coupling system
CN115657101A (en) * 2022-10-26 2023-01-31 北京航空航天大学 Method for assisting GNSS-INS (global navigation satellite system-inertial navigation system) high-precision navigation and positioning by fisheye camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU Haowei; LIAN Baowang; LIU Shangbo: "Multi-source integrated navigation factor graph fusion algorithm based on sliding-window iterative maximum a posteriori estimation", Acta Armamentarii (兵工学报), no. 04, 15 April 2019 (2019-04-15) *

Similar Documents

Publication Publication Date Title
US11585662B2 (en) Laser scanner with real-time, online ego-motion estimation
CN113345018B (en) Laser monocular vision fusion positioning mapping method in dynamic scene
CN111583369B (en) Laser SLAM method based on facial line angular point feature extraction
CN112304307A (en) Positioning method and device based on multi-sensor fusion and storage medium
CN106595659A (en) Map merging method of unmanned aerial vehicle visual SLAM under city complex environment
US20110274343A1 (en) System and method for extraction of features from a 3-d point cloud
CN114424250A (en) Structural modeling
CN114359476A (en) Dynamic 3D urban model construction method for urban canyon environment navigation
CN114325634A (en) Method for extracting passable area in high-robustness field environment based on laser radar
CN114648584B (en) Robustness control method and system for multi-source fusion positioning
Venable et al. Large scale image aided navigation
CN111735451A (en) Point cloud matching high-precision positioning method based on multi-source prior information
CN116681733B (en) Near-distance real-time pose tracking method for space non-cooperative target
CN116147618B (en) Real-time state sensing method and system suitable for dynamic environment
CN116879870A (en) Dynamic obstacle removing method suitable for low-wire-harness 3D laser radar
CN116989763A (en) Fusion positioning and mapping method for amphibious unmanned system
Falchetti et al. Probability hypothesis density filter visual simultaneous localization and mapping
Escourrou et al. Ndt localization with 2d vector maps and filtered lidar scans
Skjellaug Feature-Based Lidar SLAM for Autonomous Surface Vehicles Operating in Urban Environments
CN117593650B (en) Moving point filtering vision SLAM method based on 4D millimeter wave radar and SAM image segmentation
Weitbrecht Monte Carlo localization in dynamic environments based on an automotive Lidar sensor cocoon
CN117689588A (en) Laser SLAM method for two-dimensional-three-dimensional structural feature joint constraint
CN117421384A (en) Visual inertia SLAM system sliding window optimization method based on common view projection matching
Zeng et al. Entropy-based Keyframe Established and Accelerated Fast LiDAR Odometry and Mapping
최성혁 Image Inconsistency Mitigation Algorithm for IMU/Vision Localization based on Aerial Scene Matching

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination