CN112562052A - Real-time positioning and mapping method for near-shore water area - Google Patents
- Publication number
- CN112562052A (application CN202011395614.6A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- data
- frame
- unmanned ship
- view
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D21/00—Measuring or testing not otherwise provided for
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
- G06Q10/047—Optimisation of routes or paths, e.g. travelling salesman problem
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G3/00—Traffic control systems for marine craft
- G08G3/02—Anti-collision systems
Abstract
The invention discloses a real-time positioning and mapping method for near-shore water areas. A plurality of fisheye cameras acquire image data, from which normalized feature-point coordinates are computed; an inertial measurement unit acquires the inertial data of the unmanned boat; sonar collects point-cloud data around the unmanned boat; the BeiDou satellite navigation system fixes the boat's current coordinates, from which its position in a global plane coordinate system is computed; a magnetometer measures the current magnetic field strength, from which the boat's attitude is computed; and a barometer measures the air pressure, from which the boat's height is computed. The inertial data are linearly interpolated and combined with the normalized feature-point coordinates to obtain the multi-view frame pose. Loop detection and mapping then combine the multi-view frame pose with the global-plane position, the boat attitude, the boat height and the sonar point-cloud data to generate a grid map and a three-dimensional point-cloud map. The method solves the problem that existing data acquisition and processing methods cannot localize accurately when applied to near-shore waters.
Description
Technical Field
The invention relates to the technical field of data acquisition and processing in near-shore water areas, and in particular to a real-time positioning and mapping method for near-shore water areas.
Background
Most real-time positioning and mapping systems currently on the market target the land and low-altitude environments of indoor service robots, unmanned ground vehicles and unmanned aerial vehicles. In a near-shore environment, however, reference objects exist on one side only, so an existing visual-inertial odometer based on a monocular or binocular camera cannot detect enough feature points within its narrow field of view to localize accurately in real time, and positioning becomes inaccurate. In addition, current real-time positioning and mapping systems that build grid maps and three-dimensional point-cloud maps rely mainly on lidar; but because lidar works effectively only above the water surface, the underwater environment remains unknown, and the resulting grid map is of little help for the unmanned boat's path planning and autonomous obstacle avoidance.
Disclosure of Invention
In view of these defects, the invention aims to provide a real-time positioning and mapping method for near-shore water areas, solving the problem that existing data acquisition and processing methods cannot localize accurately when applied to near-shore waters.
To achieve this purpose, the invention adopts the following technical scheme. A real-time positioning and mapping method for a near-shore water area comprises the following steps:
Step A comprises the following steps:
Step A1: a plurality of fisheye cameras acquire data, from which normalized feature-point coordinates are computed;
Step A2: an inertial measurement unit acquires the inertial data PVQB of the unmanned boat;
Step A3: sonar collects sonar point-cloud data around the unmanned boat;
Step A4: the BeiDou satellite navigation system fixes the current coordinates of the unmanned boat and computes its position in a global plane coordinate system;
Step A5: a magnetometer measures the current magnetic field strength and computes the attitude of the unmanned boat;
Step A6: a barometer measures the air pressure and from it computes the height of the unmanned boat.
Step B:
The inertial data PVQB are linearly interpolated and combined with the normalized feature-point coordinates to obtain the multi-view frame pose.
Step C:
Loop detection and mapping combine the multi-view frame pose with the global-plane position, the unmanned boat attitude, the unmanned boat height and the sonar point-cloud data to generate a grid map and a three-dimensional point-cloud map.
The invention has the following beneficial effects. With the fisheye cameras, the BeiDou satellite navigation system, the magnetometer, the barometer, the inertial measurement unit and the sonar, the method achieves accurate real-time positioning on the water; the loop-detection and mapping step then processes the data to build an occupancy grid map and a three-dimensional point-cloud map suited to near-shore waters, enabling unmanned boat pose estimation, real-time relocalization and loop-closure detection, and thereby path planning and autonomous obstacle avoidance for the unmanned boat.
The plurality of fisheye cameras are mounted on top of the unmanned boat; a single fisheye camera provides a 180° field of view, and the overlap between fields of view helps recover accurate depth information for the feature points. The BeiDou satellite navigation receivers are mounted on both sides of the boat; the sonar is mounted on the bottom of the boat; and the inertial measurement unit, the barometer and the magnetometer are mounted on the boat below the fisheye cameras.
The method also achieves adaptive timestamp synchronization across the multiple sensors (fisheye cameras, BeiDou satellite navigation system, magnetometer, barometer, inertial measurement unit and sonar), solving the problem that differing timestamps, caused by latency, data transmission or each sensor's independent clock, degrade the system's state estimation.
In addition, the multi-view frame pose, the BeiDou positioning data, the magnetometer data and the barometer data together form a maximum-likelihood estimation problem, enabling accurate global positioning in a near-shore environment. The maximum-likelihood estimation problem seeks the pose under which the known sensor measurements are most probable.
The sonar is a Ping360 scanning sonar; two Ping360 units are used to build the grid map and the three-dimensional point-cloud map, detecting obstacles below the water surface that threaten the unmanned boat while mapping the underwater environment in real time, thereby assisting the boat's autonomous navigation.
Drawings
FIG. 1 is a flow chart in one embodiment of the invention;
FIG. 2 is a flow chart of step A in one embodiment of the present invention;
FIG. 3 is a flow chart of step B in one embodiment of the present invention;
FIG. 4 is a flow chart of step C in one embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
The following disclosure provides many different embodiments or examples for implementing different configurations of embodiments of the invention. In order to simplify the disclosure of embodiments of the invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, embodiments of the invention may repeat reference numerals and/or reference letters in the various examples, which have been repeated for purposes of simplicity and clarity and do not in themselves dictate a relationship between the various embodiments and/or arrangements discussed. In addition, embodiments of the present invention provide examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
As shown in figs. 1-4, a real-time positioning and mapping method for a near-shore water area comprises steps A to C as set out in the summary above; the steps are detailed in the following embodiments.
The inertial data PVQB are linearly interpolated over the time interval between the previous image frame and the current image frame, and then combined with the normalized feature-point coordinates to obtain the multi-view frame pose.
In some embodiments, as shown in fig. 2, step A1 specifically comprises: first, the data of the plurality of fisheye cameras are preprocessed by the adaptive timestamp synchronization algorithm to form a multi-view frame;
then, SuperPoint feature points of the multi-view frame are extracted and tracked, giving the pixel coordinates of each feature point of the current multi-view frame;
then, the normalized plane coordinates of each feature point are obtained through the camera projection model and the distortion model:
[u_n, v_n, 1]^T = P_f / Z_f,
where P_f denotes the three-dimensional coordinates of the feature point in the i-th camera coordinate system, Z_f the z-value of those coordinates, and u_n and v_n the coordinates of the feature point on the i-th camera's normalized plane.
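The projection step can be sketched as follows (illustrative Python; the intrinsic values fx, fy, cx, cy are assumed example numbers, and lens distortion is assumed already removed, since a real fisheye pipeline would first invert the distortion model):

```python
def normalize_pixel(u, v, fx, fy, cx, cy):
    """Map a pixel coordinate to the z = 1 normalized image plane.

    (fx, fy) are focal lengths in pixels and (cx, cy) the principal point;
    the result (u_n, v_n) satisfies [u_n, v_n, 1]^T = P_f / Z_f.
    """
    return (u - cx) / fx, (v - cy) / fy

# Example with assumed intrinsics:
un, vn = normalize_pixel(420.0, 260.0, fx=400.0, fy=400.0, cx=320.0, cy=240.0)
```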
The adaptive timestamp synchronization is an existing method, with the following specific steps: a flag bit is set for each message channel; when a message arrives, a callback function sets that channel's flag and stores the message in the channel's buffer queue. Once the flag of every message channel is set, the newest message is taken as the key point, and each message buffer queue is then searched for the message nearest the key point whose time difference does not exceed a threshold.
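The flag-bit scheme above can be sketched in Python as follows (the channel names, queue depth and 10 ms threshold are illustrative assumptions, not values from the patent):

```python
from collections import deque

class AdaptiveSync:
    """Sketch of the flag-bit adaptive timestamp synchronization."""

    def __init__(self, channels, threshold=0.010):
        self.queues = {c: deque(maxlen=100) for c in channels}
        self.flags = {c: False for c in channels}
        self.threshold = threshold

    def push(self, channel, stamp, msg):
        # Callback: set the channel's flag and buffer the message.
        self.flags[channel] = True
        self.queues[channel].append((stamp, msg))
        if all(self.flags.values()):
            return self._match()
        return None

    def _match(self):
        # The newest buffered message overall is the key point.
        key = max(s for q in self.queues.values() for s, _ in q)
        out = {}
        for c, q in self.queues.items():
            stamp, msg = min(q, key=lambda sm: abs(sm[0] - key))
            if abs(stamp - key) > self.threshold:
                return None  # no message close enough on this channel
            out[c] = (stamp, msg)
        for c in self.flags:  # reset flags for the next round
            self.flags[c] = False
        return out
```

A synchronized set is emitted only when every channel has a message within the threshold of the key point; otherwise the caller keeps buffering.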
SuperPoint is an algorithm that obtains feature points with a self-supervised convolutional neural network. The camera projection model is the optical projection equation mapping the scene onto the image plane. The distortion model is the physical equation describing the distortion produced by the lens as it projects light onto the sensor.
For example, step A2 specifically comprises: the inertial measurement unit pre-integrates its inertial data PVQB between the previous multi-view frame and the current moment by the fourth-order Runge-Kutta method. The inertial data PVQB comprise the position p_{k+1}, the velocity v_{k+1}, the rotation q_{k+1}, the gyroscope bias b_g and the accelerometer bias b_a, propagated by the kinematics
p_{k+1} = p_k + v_k Δt + (1/2) (R(q_k)(â_k − b_a) − g_w) Δt²,
v_{k+1} = v_k + (R(q_k)(â_k − b_a) − g_w) Δt,
q_{k+1} = normalize(q_k ⊗ [1, (1/2)(ω̂_k − b_g) Δt]),
where ω̂ denotes the measurement of the gyroscope in the inertial measurement unit, â the measurement of the accelerometer, R(q) the function converting the quaternion q = [q_w q_v] into a rotation matrix, normalize(·) the operation that normalizes the quaternion, and g_w the gravitational acceleration at the current location; the fourth-order Runge-Kutta method integrates these derivatives using four evaluations within each time interval.
the inertial measurement unit is a device for measuring the three-axis attitude angle or angular rate and acceleration of an object, and is mainly used for navigation. The three-axis three-dimensional attitude sensing device comprises three single-axis accelerometers and three single-axis gyroscopes, wherein the accelerometers are used for detecting acceleration signals of an object in independent three axes of a carrier coordinate system, and the gyroscopes are used for detecting angular velocity signals of the carrier relative to a navigation coordinate system, measuring the angular velocity and the acceleration of the object in a three-dimensional space and calculating the attitude of the object according to the angular velocity and the acceleration.
It should be noted that step A3 specifically comprises: the sonar first resolves each acoustic return into a range
r = S_water · T / 2,
where S_water is the propagation speed of sound in water and T is the measured travel time; the acoustic data are then converted into sonar point-cloud data, and measurement noise is removed, through the sonar motion-distortion model
G_p_{s,t} = G_p_{b,t} + G_R_{b,t} · [r cos ε cos α, r cos ε sin α, r sin ε]^T,
where t_s is the timestamp of the sonar datum, falling between times t_0 and t_1 (so the boat pose at t_s is interpolated between the poses at t_0 and t_1), G_p_{b,t} is the position of the unmanned boat in the world coordinate system G at time t, G_R_{b,t} its orientation, r the distance measured by the sonar, α the azimuth angle, ε the pitch angle, and G_p_{s,t} the position of the sonar point in the world coordinate system G at time t.
The sonar comprises a horizontal sonar and a vertical sonar, wherein the horizontal sonar is horizontally installed, the vertical sonar is vertically installed, the horizontal sonar is used for constructing a grid map, and the vertical sonar is used for constructing a three-dimensional point cloud map.
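The range conversion and the projection of a sonar return into the world frame can be sketched as follows (illustrative Python; the 1500 m/s sound speed, the two-way travel-time convention and the body-to-world rotation R_wb are assumptions):

```python
import numpy as np

def sonar_range(T, s_water=1500.0):
    """Convert a two-way travel time T (s) into a range (m); 1500 m/s is a
    typical assumed sound speed in water."""
    return s_water * T / 2.0

def sonar_point_world(p_b, R_wb, r, alpha, eps):
    """Project a sonar return (range r, azimuth alpha, pitch eps) into the
    world frame using the boat pose (p_b, R_wb) interpolated at the sonar
    timestamp, following the motion-distortion model above."""
    p_s = np.array([r * np.cos(eps) * np.cos(alpha),
                    r * np.cos(eps) * np.sin(alpha),
                    r * np.sin(eps)])
    return p_b + R_wb @ p_s
```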
Optionally, step A4 specifically comprises: the BeiDou satellite navigation system resolves its data into coordinates in the longitude-latitude coordinate system, and then converts them into the current coordinates of the unmanned boat in a global plane coordinate system whose origin is the first frame of BeiDou satellite signals.
The BeiDou Navigation Satellite System (BDS) consists of a space segment, a ground segment and a user segment, and can provide all-weather, round-the-clock, high-precision and high-reliability positioning, navigation and timing services to all kinds of users worldwide, with positioning precision at the decimetre to centimetre level.
Specifically, in step A5 the magnetometer resolves its measurement into the yaw angle of the unmanned boat in the global coordinate system, which gives the unmanned boat attitude, using
M_XYZ′ = R_level · bM_XYZ,  w_yaw′ = atan2(−M_Y′, M_X′),
where bM_XYZ is the magnetometer measurement, R_level is the rotation matrix from the unmanned boat coordinate system to the level (horizontal) global coordinate frame, M_Y′ and M_X′ are two components of M_XYZ′, and w_yaw′ is the yaw angle of the unmanned boat in the global coordinate system.
the step a6 specifically includes: the barometer performs data calculation according to air pressure:
wherein, in the above formula, P is the air pressure measured by the barometer, PbFor reference to atmospheric pressure, TbIs a reference temperature, LbAs rate of temperature decrease, hbFor reference height, R is the gas constant, g0The gravity acceleration, M is the earth air molar mass,Bztthe height of the unmanned boat.
The magnetometer obtains the yaw angle of the unmanned ship under the global coordinate system through measurement and calculation, the yaw angle is the unmanned ship attitude, and the barometer obtains the height of the unmanned ship through measurement and calculation.
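Both calculations can be sketched as follows (illustrative Python; the atan2 sign convention for heading is an assumption, and the barometer defaults are international-standard-atmosphere reference values rather than values from the patent):

```python
import math
import numpy as np

def yaw_from_magnetometer(m_body, R_level):
    """Tilt-compensate the body-frame magnetic measurement bM_XYZ with the
    roll/pitch rotation R_level, then take the heading angle."""
    mx, my, _ = R_level @ m_body
    return math.atan2(-my, mx)

def height_from_pressure(P, Pb=101325.0, Tb=288.15, Lb=-0.0065, hb=0.0):
    """Invert the barometric formula quoted above:
    h = hb + (Tb/Lb) * ((P/Pb)^(-R*Lb/(g0*M)) - 1)."""
    R, g0, M = 8.3144598, 9.80665, 0.0289644  # gas const, gravity, molar mass
    return hb + (Tb / Lb) * ((P / Pb) ** (-R * Lb / (g0 * M)) - 1.0)
```

At the reference pressure the height is the reference height; at 900 hPa it comes out near the expected ~1000 m standard-atmosphere altitude.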
Preferably, as shown in fig. 3, in step B the position p of the inertial data PVQB is linearly interpolated as
P(t) = (1 − t) P_0 + t P_1, t ∈ [0, 1];
the velocity v_{k+1} of the inertial data PVQB is linearly interpolated as
V(t) = (1 − t) V_0 + t V_1, t ∈ [0, 1];
and the rotation is interpolated spherically as
Q(t) = (sin((1 − t) Δθ) Q_0 + sin(t Δθ) Q_1) / sin Δθ, t ∈ [0, 1],
where Δθ denotes the angle between the two quaternion vectors in Euclidean space.
Linear interpolation allows the inertial data of the inertial measurement unit to be used fully. In the inertial data PVQB, the position p is the position of the inertial measurement unit in the world coordinate system; the velocity v_{k+1} is its velocity in the world coordinate system; the rotation q is its attitude in the world coordinate system; and the gyroscope bias b_g and accelerometer bias b_a are the biases on the measured angular velocity and acceleration, which require no interpolation.
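The two interpolation rules can be sketched as follows (illustrative Python; the short-arc handling and the near-parallel fallback are standard additions not stated in the patent):

```python
import numpy as np

def lerp(a, b, t):
    """Linear interpolation: (1 - t) * a + t * b."""
    return (1.0 - t) * a + t * b

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions; dtheta is
    the angle between the quaternion vectors, matching the text's
    Delta-theta."""
    d = float(np.dot(q0, q1))
    if d < 0.0:              # take the short way around
        q1, d = -q1, -d
    if d > 0.9995:           # nearly parallel: fall back to lerp
        q = lerp(q0, q1, t)
        return q / np.linalg.norm(q)
    dtheta = np.arccos(d)
    return (np.sin((1 - t) * dtheta) * q0
            + np.sin(t * dtheta) * q1) / np.sin(dtheta)
```

Halfway between the identity and a 90° rotation about z, slerp yields the 45° rotation, as expected.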
In some embodiments, in step B the system first checks whether it is initialized.
If not initialized, it checks whether the excitation of the inertial measurement unit is sufficient. When the excitation exceeds a threshold, structure from motion (SfM) is performed on all multi-view frames in the sliding window to obtain landmark points; the poses of the other frames in the window are solved with the PnP algorithm from those landmarks, and the three-dimensional coordinates of the landmarks seen by each frame in the window are recovered by triangulation; the multi-view frames in the window are then aligned with the inertial measurement unit to correct its biases and each multi-view frame's velocity, gravity and scale factor.
If initialized, the three-dimensional coordinates of the landmarks seen by the current multi-view frame are recovered by triangulation; the Jacobian of the inertial-measurement-unit error and the Jacobian of the multi-view frame's reprojection error with respect to the optimization variables are computed; a Bundle Adjustment over all multi-view frames in the sliding window yields the optimized multi-view frame poses, the gyroscope and accelerometer biases, the extrinsics between each fisheye camera and the inertial measurement unit, and the inverse depths; finally, the prior constraint for the next sliding-window optimization is computed.
The system here is the near-shore real-time positioning and mapping system. In the sliding-window optimization, the multi-view frame poses are estimated from the visual constraints of the multiple fisheye cameras and the inertial-measurement-unit constraints, and loop-closure detection against earlier multi-view key frames both handles localization failures of the unmanned boat and removes the accumulated error of the multi-view frame pose estimates, achieving accurate local positioning in the near-shore environment.
Computing the Jacobian of the inertial-measurement-unit error and the Jacobian of the multi-view frame's reprojection error proceeds as follows: the prior constraint, the visual constraints and the inertial-measurement-unit constraints form a Hessian matrix, and the resulting nonlinear least-squares problem is solved with the Dog-Leg algorithm, yielding the optimized poses of all frames in the sliding window together with the gyroscope bias, the accelerometer bias, the extrinsics between each camera and the inertial measurement unit, and the inverse depths. The Hessian matrix is the square matrix of second-order partial derivatives of a multivariate function and describes its local curvature; the Dog-Leg algorithm is a trust-region nonlinear optimization method.
In the Bundle Adjustment, the prior constraint of the next sliding-window optimization is computed as follows: the parallax between the second-to-last and third-to-last frames in the sliding window is calculated; if it is below a threshold, the data of the second-to-last frame are marginalized; otherwise, the data of the oldest frame are marginalized.
Structure from motion (SfM) here means: find a pair of multi-view frames in the sliding window with sufficient parallax between them, and recover the first batch of landmark points with the epipolar-geometry algorithm.
The PnP algorithm is an algorithm for estimating the pose by knowing the three-dimensional space coordinates of n feature points and the two-dimensional projection positions thereof.
The triangulation method is as follows. If a feature point f is observed by fisheye camera i and fisheye camera j simultaneously, then
z_i · x̄_i = R_ij (z_j · x̄_j) + p_ij,
where x̄_i and x̄_j are the representations of the feature point f on the normalized planes of fisheye cameras i and j, and the rotation matrix R_ij and translation p_ij are the extrinsics between the two fisheye cameras, obtained by camera calibration. Left-multiplying both sides by the skew-symmetric matrix of one of the bearings (equivalently, stacking the equations and solving in the least-squares sense) yields a linear system in the two depths; solving it gives z_i and z_j, i.e. the depth information of the feature point f.
For example, as shown in fig. 4, step C specifically comprises: first, detect whether the current multi-view frame pose is a key frame.
If it is a key frame, a similarity score is computed against the multi-view key-frame database; when the score exceeds a threshold, feature points are matched between the frame and the loop-closure frame and their relative pose is computed with the PnP algorithm; the oldest frame in the multi-view key-frame database is fixed, sequential edges and loop-closure edges are added, and a four-degree-of-freedom pose graph is optimized, forming a multi-view key frame that is added to the database.
If it is not a key frame, the multi-view frame pose forms a multi-view non-key frame.
Then, a nonlinear maximum-likelihood estimation combining the multi-view key frame or non-key frame with the constraints of the global-plane position, the unmanned boat attitude and the unmanned boat height yields the unmanned boat pose; this pose is associated with the sonar point-cloud data through the adaptive timestamp synchronization algorithm, after which a grid map is built with the coverage (occupancy) grid map algorithm and a three-dimensional point-cloud map is built with a down-sampling filter.
the overlay grid map algorithm is as follows:
where m_i represents the state of the i-th grid cell, z_{1:t} represents the sonar point cloud data from time 1 to time t, and x_{1:t} represents the unmanned boat poses from time 1 to time t.
The four degrees of freedom are the position x, y, z and the yaw angle.
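The coverage (occupancy) grid map posterior p(m_i | z_{1:t}, x_{1:t}) is conventionally maintained per cell in log-odds form, which makes the Bayesian update additive. A minimal sketch, assuming illustrative hit/miss probabilities not taken from the patent:

```python
import math

class OccupancyGrid:
    """Per-cell log-odds occupancy update, a common realization of
    p(m_i | z_{1:t}, x_{1:t}). The p_hit/p_miss values are assumptions."""
    def __init__(self, n_cells, p_hit=0.7, p_miss=0.4):
        self.logodds = [0.0] * n_cells          # 0.0 corresponds to p = 0.5 (unknown)
        self.l_hit = math.log(p_hit / (1.0 - p_hit))
        self.l_miss = math.log(p_miss / (1.0 - p_miss))

    def update(self, cell, hit):
        # Bayes filter in log-odds form: each observation adds a constant.
        self.logodds[cell] += self.l_hit if hit else self.l_miss

    def probability(self, cell):
        # Convert log-odds back to an occupancy probability.
        return 1.0 - 1.0 / (1.0 + math.exp(self.logodds[cell]))

grid = OccupancyGrid(4)
for _ in range(3):
    grid.update(0, hit=True)    # sonar returns land in cell 0
    grid.update(1, hit=False)   # the beam passes freely through cell 1
```

After three observations, cell 0 converges toward occupied and cell 1 toward free, while untouched cells remain at 0.5.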
It should be noted that, in the step C, the constraint of the global plane coordinate system position of the beidou satellite navigation system is as follows:
where ᴮᴰˢZ_t denotes the observed value of the pose of the BeiDou satellite navigation system at time t, ᴮᴰˢh_t(X) denotes the observation equation of the BeiDou satellite navigation system at time t, and ʷp_t denotes the position of the unmanned boat in the global coordinate system at time t;
the constraints on the unmanned boat attitude of the magnetometer are:
where ᵐh_t(X) denotes the observation equation of the magnetometer at time t, ʷyaw′_t is the yaw angle of the unmanned boat in the world coordinate system obtained from the magnetometer data at time t, and ʷyaw_t is the estimated value of the yaw angle of the unmanned boat in the world coordinate system at time t;
the constraint of the unmanned boat height of the barometer is as follows:
where ᴮZ_t denotes the observed value of the barometer in the global coordinate system at time t, ᴮh_t(X) denotes the observation equation of the barometer at time t, and ʷz_t denotes the z-axis value of the unmanned boat in the global coordinate system at time t.
The method can obtain global positioning with higher precision.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and not to be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (10)
1. A real-time positioning and mapping method for a near-shore water area, characterized in that the method comprises the following steps:
the step A comprises the following steps:
step A1: a plurality of fisheye cameras acquire data, and normalized feature point coordinates are then obtained by data solving;
step A2: the method comprises the steps that an inertia measurement unit obtains inertia data PVQB of the unmanned ship;
step A3: sonar collects sonar point cloud data around the unmanned boat;
step A4: the Beidou satellite navigation system positions the current coordinate of the unmanned ship and calculates the position of a global plane coordinate system;
step A5: the magnetometer measures the current magnetic field intensity and calculates the attitude of the unmanned boat;
step A6: the barometer measures the air pressure so as to calculate the height of the unmanned ship;
step B: after linear interpolation is performed on the inertial data PVQB, the multi-view frame pose is obtained by combining it with the normalized feature point coordinates;
step C: generating a grid map and a three-dimensional point cloud map by combining the multi-view frame pose with the global plane coordinate system position, the unmanned boat attitude, the unmanned boat height and the sonar point cloud data through loop detection and mapping.
2. The method for real-time positioning and mapping for near-shore water area of claim 1, wherein said step a1 is specifically: firstly, preprocessing data of a plurality of fisheye cameras through a self-adaptive timestamp synchronization algorithm to form a multi-view frame;
then, extracting and tracking SuperPoint characteristic points of the multi-view frame to obtain pixel coordinates of each characteristic point of the current multi-view frame;
then, obtaining the normalized plane coordinates of the feature points through the processing of a camera projection model and a distortion model:
where P_f represents the three-dimensional coordinates of the feature point in the i-th camera coordinate system, Z_f represents the z-value of the three-dimensional coordinates of the feature point in the i-th camera coordinate system, and u_n and v_n represent the coordinates of the feature point in the i-th camera pixel coordinate system.
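The projection to the normalized plane described in claim 2 can be sketched as follows (a hedged sketch assuming an ideal pinhole intrinsic model with hypothetical intrinsics fx, fy, cx, cy; a real fisheye lens, as in the patent, would additionally invert a distortion model here):

```python
def project_to_normalized(P_f):
    """Normalized plane coordinates of a 3-D point P_f = (X, Y, Z)
    in the camera frame: divide by the z-value Z_f."""
    X, Y, Z = P_f
    return (X / Z, Y / Z)

def pixel_to_normalized(u, v, fx, fy, cx, cy):
    """Invert the pinhole intrinsics to recover normalized coordinates.
    (fx, fy, cx, cy) are assumed intrinsics, not values from the patent."""
    return ((u - cx) / fx, (v - cy) / fy)
```

A round trip through assumed intrinsics (fx = fy = 400, cx = 320, cy = 240) maps the point (1, 0.5, 5) to normalized (0.2, 0.1) and back.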
3. The method for real-time positioning and mapping for near-shore water area of claim 2, wherein said step a2 is specifically:
the inertial measurement unit pre-integrates its inertial data PVQB between the current multi-view frame and the previous multi-view frame through a fourth-order Runge-Kutta method, wherein the inertial data PVQB comprises the translation p_{k+1}, the velocity v_{k+1}, the rotation q_{k+1}, the gyroscope bias b_g and the accelerometer bias b_a;
where ω̂ represents the measurement value of the gyroscope in the inertial measurement unit;
the velocity v_{k+1} is computed by the fourth-order Runge-Kutta method as follows:
where q = [q_w q_v], R(q) is the function that converts the quaternion q into the rotation matrix R, normalize(·) normalizes the quaternion, â represents the measurement value of the accelerometer in the inertial measurement unit, and g_w represents the gravitational acceleration at the current location;
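The fourth-order Runge-Kutta propagation of the orientation quaternion from a gyroscope measurement, with the normalization step mentioned above, can be sketched as follows (bias terms are omitted and the function names are illustrative, not from the patent):

```python
import math

def quat_mult(a, b):
    # Hamilton product of quaternions (w, x, y, z).
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_dot(q, w):
    # Quaternion kinematics: dq/dt = 0.5 * q (x) (0, w).
    return tuple(0.5*c for c in quat_mult(q, (0.0,) + tuple(w)))

def normalize(q):
    n = math.sqrt(sum(c*c for c in q))
    return tuple(c/n for c in q)

def rk4_step(q, w, dt):
    """One fourth-order Runge-Kutta step under a constant gyro measurement w."""
    k1 = q_dot(q, w)
    k2 = q_dot(tuple(qi + 0.5*dt*ki for qi, ki in zip(q, k1)), w)
    k3 = q_dot(tuple(qi + 0.5*dt*ki for qi, ki in zip(q, k2)), w)
    k4 = q_dot(tuple(qi + dt*ki for qi, ki in zip(q, k3)), w)
    q_new = tuple(qi + dt/6.0*(a + 2*b + 2*c + d)
                  for qi, a, b, c, d in zip(q, k1, k2, k3, k4))
    return normalize(q_new)  # re-normalize, as in the claim
```

Integrating a constant angular rate of π/2 rad/s about z for one second yields a 90° yaw rotation.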
4. the method for real-time positioning and mapping for near-shore water area of claim 3, wherein said step A3 is specifically:
the sonar obtains the range through data solving (r = S_water·T/2), converts the acoustic wave data into sonar point cloud data, and removes measurement noise through the processing of a sonar motion distortion model, where S_water is the propagation speed of sound in water and T is the measurement time;
the sonar motion distortion model is processed by the following steps:
where t_s is the timestamp of the sonar data, t_s falls between times t_0 and t_1, ᴳp_t denotes the position of the unmanned boat in the world coordinate system G at time t, r is the range information of the sonar, α is the azimuth angle, ε is the pitch angle, and ᴳp_{s,t} denotes the position of the sonar point cloud in the world coordinate system G at time t.
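The conversion of a range/azimuth/pitch sonar return into a world-frame point, with the boat pose interpolated at the sonar timestamp t_s between t_0 and t_1 to undo motion distortion, can be sketched as follows (a sketch under assumed spherical-coordinate conventions; rotation of the boat is omitted for brevity and all names are illustrative):

```python
import math

def sonar_to_body(r, alpha, eps):
    """Range r, azimuth alpha, pitch eps -> Cartesian point in the sonar frame
    (one common spherical convention; the patent's own images may differ)."""
    return (r*math.cos(eps)*math.cos(alpha),
            r*math.cos(eps)*math.sin(alpha),
            r*math.sin(eps))

def sonar_point_world(r, alpha, eps, t_s, t0, p0, t1, p1):
    """World position of the return: boat position linearly interpolated at
    t_s between poses (t0, p0) and (t1, p1), plus the local sonar point."""
    u = (t_s - t0) / (t1 - t0)
    boat = tuple((1.0 - u)*a + u*b for a, b in zip(p0, p1))
    local = sonar_to_body(r, alpha, eps)
    return tuple(bc + lc for bc, lc in zip(boat, local))
```

For a boat moving from (0,0,0) to (2,0,0) over one second, a forward 10 m return timestamped halfway lands at (11, 0, 0).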
5. The method for real-time positioning and mapping for near-shore water area of claim 4, wherein said step A4 is specifically: the Beidou satellite navigation system obtains coordinates under a longitude and latitude coordinate system through data resolving, and then converts the coordinates under the longitude and latitude coordinate system into current coordinates of the unmanned ship under a global plane coordinate system with the first frame of Beidou satellite signals as an origin:
ᴮᴰˢp_t = [ᴮᴰˢx_t ᴮᴰˢy_t ᴮᴰˢz_t]^T.
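Converting the BeiDou latitude/longitude fix into a local plane frame whose origin is the first fix can be sketched as follows. The patent does not specify the projection; this sketch assumes an equirectangular small-area approximation on the WGS-84 equatorial radius, which is adequate for near-shore extents (UTM or a full ENU conversion are common alternatives):

```python
import math

R_EARTH = 6378137.0  # WGS-84 equatorial radius in metres (assumed datum)

def latlon_to_plane(lat, lon, lat0, lon0):
    """Local plane coordinates (x east, y north, metres) of (lat, lon)
    relative to the origin fix (lat0, lon0), all in degrees."""
    x = R_EARTH * math.radians(lon - lon0) * math.cos(math.radians(lat0))
    y = R_EARTH * math.radians(lat - lat0)
    return (x, y)
```

One millidegree of latitude maps to roughly 111 m north; one millidegree of longitude at 30° N maps to roughly 96 m east.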
6. the method for real-time positioning and mapping for near-shore water areas of claim 5, wherein: in the step a5, the magnetometer obtains the yaw angle of the unmanned ship in the global coordinate system through data calculation, so as to obtain the unmanned ship attitude, and a calculation formula is as follows:
where ᵇM_XYZ is the measurement value of the magnetometer, R is the rotation matrix from the unmanned boat coordinate system to the horizontal plane of the global coordinate system, M_Y′ and M_X′ are two components of M_XYZ′, and ʷyaw′ is the yaw angle of the unmanned boat in the global coordinate system;
the step A6 specifically includes: the barometer performs data solving according to the air pressure: ᴮz_t = h_b + (T_b/L_b)·[(P/P_b)^(−R·L_b/(g_0·M)) − 1], where P is the air pressure measured by the barometer, P_b is the reference atmospheric pressure, T_b is the reference temperature, L_b is the temperature lapse rate, h_b is the reference height, R is the gas constant, g_0 is the gravitational acceleration, M is the molar mass of air, and ᴮz_t is the height of the unmanned boat.
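The two scalar computations of claim 6 can be sketched together: a tilt-compensated magnetometer yaw and the international barometric formula solved for height. Both are textbook formulas, but the axis/sign conventions and the reference constants below are assumptions (ISA sea-level values), not values taken from the patent:

```python
import math

def yaw_from_magnetometer(m, roll, pitch):
    """Rotate the body-frame magnetometer reading m = (mx, my, mz) into the
    horizontal plane, then take the yaw angle. Magnetic declination is
    ignored and the axis convention is assumed."""
    mx, my, mz = m
    sr, cr = math.sin(roll), math.cos(roll)
    sp, cp = math.sin(pitch), math.cos(pitch)
    mx_h = mx*cp + my*sr*sp + mz*cr*sp   # horizontal component M_X'
    my_h = my*cr - mz*sr                 # horizontal component M_Y'
    return math.atan2(-my_h, mx_h)

def height_from_pressure(P, Pb=101325.0, Tb=288.15, Lb=-0.0065,
                         hb=0.0, R=8.3144598, g0=9.80665, M=0.0289644):
    """International barometric formula inverted for height; the defaults
    are ISA sea-level reference values."""
    return hb + (Tb / Lb) * ((P / Pb) ** (-R * Lb / (g0 * M)) - 1.0)
```

At sea-level pressure the height is zero, and a pressure of about 89.87 kPa corresponds to roughly 1000 m.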
7. The method for real-time positioning and mapping for near-shore water areas of claim 6, wherein: in the step B, the formula for performing linear interpolation on the translation p of the inertial data PVQB is:
P(t) = (1 − t)·P_0 + t·P_1, t ∈ [0,1];
the formula for performing linear interpolation on the velocity v_{k+1} of the inertial data PVQB is:
V(t) = (1 − t)·V_0 + t·V_1, t ∈ [0,1];
the quaternion q of the inertial data PVQB is interpolated by spherical linear interpolation: q(t) = [sin((1 − t)·Δθ)·q_0 + sin(t·Δθ)·q_1]/sin(Δθ), t ∈ [0,1], where Δθ represents the angle between the two quaternion vectors in Euclidean space.
8. The method for real-time positioning and mapping for near-shore water areas of claim 7, wherein: in the step B, whether the system is initialized or not is detected;
when not initialized, it is detected whether the excitation of the inertial measurement unit is sufficient; when the excitation meets a threshold value, Structure from Motion (SfM) is performed on all multi-view frames in the sliding window to obtain landmark points; the poses of the other frames in the sliding window are solved with the PnP algorithm according to the landmark points, and the three-dimensional coordinates of the landmark points seen by each frame in the sliding window are recovered with a triangulation algorithm; the multi-view frames in the sliding window are aligned with the inertial measurement unit so as to correct the bias of the inertial measurement unit and the velocity, gravity and scale factor of each multi-view frame;
when initialized, the three-dimensional coordinates of the landmark points seen by the current multi-view frame are recovered with a triangulation algorithm; the Jacobian matrix of the inertial measurement unit error with respect to the optimized variables and the Jacobian matrix of the multi-view frame reprojection error are calculated; a Bundle Adjustment algorithm is performed on all multi-view frames in the sliding window to obtain the optimized multi-view frame poses, the gyroscope bias, the accelerometer bias, the extrinsic parameters between each fisheye camera and the inertial measurement unit, and the inverse depths; the prior constraint for the next sliding-window optimization is calculated.
9. The method for real-time positioning and mapping for a near-shore water area according to claim 8, wherein the step C specifically comprises: firstly, detecting whether the pose of the current multi-view frame is a key frame;
when the multi-view frame pose is a key frame, similarity calculation is performed between it and the multi-view key frame database; when the similarity score is greater than a threshold value, feature point matching is performed between the multi-view frame and the closed-loop frame, and the relative pose between them is calculated with the PnP algorithm; the oldest frame in the multi-view key frame database is fixed, sequence edges and closed-loop edges are added, and a pose graph with four degrees of freedom is optimized to form a multi-view key frame, which is added to the multi-view key frame database;
when the multi-view frame pose is not a key frame, the multi-view frame forms a multi-view non-key frame;
then, nonlinear maximum likelihood estimation is performed on the multi-view key frame or multi-view non-key frame, combined with the constraints of the global plane coordinate system position, the unmanned boat attitude and the unmanned boat height, to obtain the pose of the unmanned boat; the pose is associated with the sonar point cloud data through the adaptive timestamp synchronization algorithm and pose association; a grid map is then constructed with the coverage grid map algorithm and a three-dimensional point cloud map is constructed with a down-sampling filter;
the overlay grid map algorithm is as follows:
where m_i represents the state of the i-th grid cell, z_{1:t} represents the sonar point cloud data from time 1 to time t, and x_{1:t} represents the unmanned boat poses from time 1 to time t.
10. The method for real-time positioning and mapping for near-shore water areas of claim 9, wherein: in the step C, the constraint of the global plane coordinate system position of the Beidou satellite navigation system is as follows:
where ᴮᴰˢZ_t denotes the observed value of the pose of the BeiDou satellite navigation system at time t, ᴮᴰˢh_t(X) denotes the observation equation of the BeiDou satellite navigation system at time t, and ʷp_t denotes the position of the unmanned boat in the global coordinate system at time t;
the constraints on the unmanned boat attitude of the magnetometer are:
where ᵐh_t(X) denotes the observation equation of the magnetometer at time t, ʷyaw′_t is the yaw angle of the unmanned boat in the world coordinate system obtained from the magnetometer data at time t, and ʷyaw_t is the estimated value of the yaw angle of the unmanned boat in the world coordinate system at time t;
the constraint of the unmanned boat height of the barometer is as follows:
where ᴮZ_t denotes the observed value of the barometer in the global coordinate system at time t, ᴮh_t(X) denotes the observation equation of the barometer at time t, and ʷz_t denotes the z-axis value of the unmanned boat in the global coordinate system at time t.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011395614.6A CN112562052B (en) | 2020-12-03 | 2020-12-03 | Real-time positioning and mapping method for near-shore water area |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011395614.6A CN112562052B (en) | 2020-12-03 | 2020-12-03 | Real-time positioning and mapping method for near-shore water area |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112562052A true CN112562052A (en) | 2021-03-26 |
CN112562052B CN112562052B (en) | 2021-07-27 |
Family
ID=75047617
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011395614.6A Active CN112562052B (en) | 2020-12-03 | 2020-12-03 | Real-time positioning and mapping method for near-shore water area |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112562052B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113256697A (en) * | 2021-04-27 | 2021-08-13 | 武汉理工大学 | Three-dimensional reconstruction method, system and device of underwater scene and storage medium |
CN113340295A (en) * | 2021-06-16 | 2021-09-03 | 广东工业大学 | Unmanned ship near-shore real-time positioning and mapping method with multiple ranging sensors |
CN113744337A (en) * | 2021-09-07 | 2021-12-03 | 江苏科技大学 | Synchronous positioning and mapping method integrating vision, IMU and sonar |
CN114485613A (en) * | 2021-12-31 | 2022-05-13 | 海南浙江大学研究院 | Multi-information fusion underwater robot positioning method |
CN115560757A (en) * | 2022-09-01 | 2023-01-03 | 中国人民解放军战略支援部队信息工程大学 | Neural network-based unmanned aerial vehicle direct positioning correction method under random attitude error condition |
Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170315247A1 (en) * | 2016-04-27 | 2017-11-02 | Sean Robert Griffin | Ship-towed hydrophone volumetric array system method |
CN108364319A (en) * | 2018-02-12 | 2018-08-03 | 腾讯科技(深圳)有限公司 | Scale determines method, apparatus, storage medium and equipment |
CN108648240A (en) * | 2018-05-11 | 2018-10-12 | 东南大学 | Based on a non-overlapping visual field camera posture scaling method for cloud characteristics map registration |
CN109816769A (en) * | 2017-11-21 | 2019-05-28 | 深圳市优必选科技有限公司 | Scene based on depth camera ground drawing generating method, device and equipment |
CN110223389A (en) * | 2019-06-11 | 2019-09-10 | 中国科学院自动化研究所 | The scene modeling method of blending image and laser data, system, device |
CN110309883A (en) * | 2019-07-01 | 2019-10-08 | 哈尔滨理工大学 | A kind of unmanned plane autonomic positioning method of view-based access control model SLAM |
CN110501017A (en) * | 2019-08-12 | 2019-11-26 | 华南理工大学 | A kind of Mobile Robotics Navigation based on ORB_SLAM2 ground drawing generating method |
CN110617814A (en) * | 2019-09-26 | 2019-12-27 | 中国科学院电子学研究所 | Monocular vision and inertial sensor integrated remote distance measuring system and method |
CN110692027A (en) * | 2017-06-05 | 2020-01-14 | 杭州零零科技有限公司 | System and method for providing easy-to-use release and automatic positioning of drone applications |
CN110726415A (en) * | 2019-10-21 | 2020-01-24 | 哈尔滨工程大学 | Self-adaptive underwater multi-beam synchronous positioning and mapping method |
US20200132456A1 (en) * | 2018-10-29 | 2020-04-30 | University Of New Hampshire | Apparatus and method for fault-proof collection of imagery for underwater survey |
CN111190424A (en) * | 2020-02-19 | 2020-05-22 | 苏州大学 | Integrated sensor for automatically positioning and establishing graph of indoor mobile robot |
CN111353198A (en) * | 2020-04-21 | 2020-06-30 | 大连理工大学 | Method for simulating surface appearance of flutter-free milling |
CN111415375A (en) * | 2020-02-29 | 2020-07-14 | 华南理工大学 | S L AM method based on multi-fisheye camera and double-pinhole projection model |
CN111458471A (en) * | 2019-12-19 | 2020-07-28 | 中国科学院合肥物质科学研究院 | Water area detection early warning method based on graph neural network |
CN111457902A (en) * | 2020-04-10 | 2020-07-28 | 东南大学 | Water area measuring method and system based on laser SLAM positioning |
CN111595334A (en) * | 2020-04-30 | 2020-08-28 | 东南大学 | Indoor autonomous positioning method based on tight coupling of visual point-line characteristics and IMU (inertial measurement Unit) |
CN111895988A (en) * | 2019-12-20 | 2020-11-06 | 北京空天技术研究所 | Unmanned aerial vehicle navigation information updating method and device |
CN111914686A (en) * | 2020-07-15 | 2020-11-10 | 云南电网有限责任公司带电作业分公司 | SAR remote sensing image water area extraction method, device and system based on surrounding area association and pattern recognition |
- 2020-12-03 CN CN202011395614.6A patent/CN112562052B/en active Active
Patent Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170315247A1 (en) * | 2016-04-27 | 2017-11-02 | Sean Robert Griffin | Ship-towed hydrophone volumetric array system method |
CN110692027A (en) * | 2017-06-05 | 2020-01-14 | 杭州零零科技有限公司 | System and method for providing easy-to-use release and automatic positioning of drone applications |
CN109816769A (en) * | 2017-11-21 | 2019-05-28 | 深圳市优必选科技有限公司 | Scene based on depth camera ground drawing generating method, device and equipment |
CN108364319A (en) * | 2018-02-12 | 2018-08-03 | 腾讯科技(深圳)有限公司 | Scale determines method, apparatus, storage medium and equipment |
CN108648240A (en) * | 2018-05-11 | 2018-10-12 | 东南大学 | Based on a non-overlapping visual field camera posture scaling method for cloud characteristics map registration |
US20200132456A1 (en) * | 2018-10-29 | 2020-04-30 | University Of New Hampshire | Apparatus and method for fault-proof collection of imagery for underwater survey |
CN110223389A (en) * | 2019-06-11 | 2019-09-10 | 中国科学院自动化研究所 | The scene modeling method of blending image and laser data, system, device |
CN110309883A (en) * | 2019-07-01 | 2019-10-08 | 哈尔滨理工大学 | A kind of unmanned plane autonomic positioning method of view-based access control model SLAM |
CN110501017A (en) * | 2019-08-12 | 2019-11-26 | 华南理工大学 | A kind of Mobile Robotics Navigation based on ORB_SLAM2 ground drawing generating method |
CN110617814A (en) * | 2019-09-26 | 2019-12-27 | 中国科学院电子学研究所 | Monocular vision and inertial sensor integrated remote distance measuring system and method |
CN110726415A (en) * | 2019-10-21 | 2020-01-24 | 哈尔滨工程大学 | Self-adaptive underwater multi-beam synchronous positioning and mapping method |
CN111458471A (en) * | 2019-12-19 | 2020-07-28 | 中国科学院合肥物质科学研究院 | Water area detection early warning method based on graph neural network |
CN111895988A (en) * | 2019-12-20 | 2020-11-06 | 北京空天技术研究所 | Unmanned aerial vehicle navigation information updating method and device |
CN111190424A (en) * | 2020-02-19 | 2020-05-22 | 苏州大学 | Integrated sensor for automatically positioning and establishing graph of indoor mobile robot |
CN111415375A (en) * | 2020-02-29 | 2020-07-14 | 华南理工大学 | S L AM method based on multi-fisheye camera and double-pinhole projection model |
CN111457902A (en) * | 2020-04-10 | 2020-07-28 | 东南大学 | Water area measuring method and system based on laser SLAM positioning |
CN111353198A (en) * | 2020-04-21 | 2020-06-30 | 大连理工大学 | Method for simulating surface appearance of flutter-free milling |
CN111595334A (en) * | 2020-04-30 | 2020-08-28 | 东南大学 | Indoor autonomous positioning method based on tight coupling of visual point-line characteristics and IMU (inertial measurement Unit) |
CN111914686A (en) * | 2020-07-15 | 2020-11-10 | 云南电网有限责任公司带电作业分公司 | SAR remote sensing image water area extraction method, device and system based on surrounding area association and pattern recognition |
Non-Patent Citations (8)
Title |
---|
MAXIME FERRERA et al.: "AQUALOC: An underwater dataset for visual–inertial–pressure localization", The International Journal of Robotics Research *
HE Jing: "Research on Operation Navigation of Water-Surface Robots", China Master's Theses Full-text Database, Basic Sciences *
WU Bo: "Research on Visual SLAM Based on IMU Fusion", China Master's Theses Full-text Database, Information Science and Technology *
ZHOU Yang: "Mobile Robot Based on Multi-Sensor Fusion", China Master's Theses Full-text Database, Information Science and Technology *
ZHANG Jingdong: "Research on Multi-Fisheye-Camera *** in Visual SLAM", China Master's Theses Full-text Database, Information Science and Technology *
NIU Bocheng: "Research on Acoustic-Vision-Based Underwater Simultaneous Localization and Mapping for UUVs", China Master's Theses Full-text Database, Engineering Science and Technology II *
FAN Yujie: "Research on Guidance and Control Technology of Space ***", China Master's Theses Full-text Database, Engineering Science and Technology II *
HUANG He: "Research on Monocular Visual SLAM Fusing IMU", China Master's Theses Full-text Database, Information Science and Technology *
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113256697A (en) * | 2021-04-27 | 2021-08-13 | 武汉理工大学 | Three-dimensional reconstruction method, system and device of underwater scene and storage medium |
CN113340295A (en) * | 2021-06-16 | 2021-09-03 | 广东工业大学 | Unmanned ship near-shore real-time positioning and mapping method with multiple ranging sensors |
CN113744337A (en) * | 2021-09-07 | 2021-12-03 | 江苏科技大学 | Synchronous positioning and mapping method integrating vision, IMU and sonar |
CN113744337B (en) * | 2021-09-07 | 2023-11-24 | 江苏科技大学 | Synchronous positioning and mapping method integrating vision, IMU and sonar |
CN114485613A (en) * | 2021-12-31 | 2022-05-13 | 海南浙江大学研究院 | Multi-information fusion underwater robot positioning method |
CN114485613B (en) * | 2021-12-31 | 2024-05-17 | 浙江大学海南研究院 | Positioning method for multi-information fusion underwater robot |
CN115560757A (en) * | 2022-09-01 | 2023-01-03 | 中国人民解放军战略支援部队信息工程大学 | Neural network-based unmanned aerial vehicle direct positioning correction method under random attitude error condition |
CN115560757B (en) * | 2022-09-01 | 2023-08-22 | 中国人民解放军战略支援部队信息工程大学 | Unmanned aerial vehicle direct positioning correction method based on neural network under random attitude error condition |
Also Published As
Publication number | Publication date |
---|---|
CN112562052B (en) | 2021-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112562052B (en) | Real-time positioning and mapping method for near-shore water area | |
CN113340295B (en) | Unmanned ship near-shore real-time positioning and mapping method with multiple ranging sensors | |
CN110243358B (en) | Multi-source fusion unmanned vehicle indoor and outdoor positioning method and system | |
US10295365B2 (en) | State estimation for aerial vehicles using multi-sensor fusion | |
Piniés et al. | Inertial aiding of inverse depth SLAM using a monocular camera | |
KR101192825B1 (en) | Apparatus and method for lidar georeferencing based on integration of gps, ins and image at | |
KR100761011B1 (en) | Aiding inertial navigation system using a camera type sun sensor and method there of | |
CN110044354A (en) | A kind of binocular vision indoor positioning and build drawing method and device | |
CN113551665B (en) | High-dynamic motion state sensing system and sensing method for motion carrier | |
CN115574816B (en) | Bionic vision multi-source information intelligent perception unmanned platform | |
CN114019552A (en) | Bayesian multi-sensor error constraint-based location reliability optimization method | |
JP3900365B2 (en) | Positioning device and positioning method | |
CN110068306A (en) | A kind of unmanned plane inspection photometry system and method | |
Karam et al. | Integrating a low-cost mems imu into a laser-based slam for indoor mobile mapping | |
JP5355443B2 (en) | Position correction system | |
CN112747749B (en) | Positioning navigation system based on binocular vision and laser fusion | |
CN116380079A (en) | Underwater SLAM method for fusing front-view sonar and ORB-SLAM3 | |
CN115930948A (en) | Orchard robot fusion positioning method | |
CN116026323A (en) | Positioning and regional error proofing method for engine oil filling machine | |
CN115542363A (en) | Attitude measurement method suitable for vertical downward-looking aviation pod | |
CN115290090A (en) | SLAM map construction method based on multi-sensor information fusion | |
CN114812554A (en) | Multi-source fusion robot indoor absolute positioning method based on filtering | |
CN115344033A (en) | Monocular camera/IMU/DVL tight coupling-based unmanned ship navigation and positioning method | |
CN117128951B (en) | Multi-sensor fusion navigation positioning system and method suitable for automatic driving agricultural machinery | |
Wei | Multi-sources fusion based vehicle localization in urban environments under a loosely coupled probabilistic framework |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |