CN114322996A - Pose optimization method and device of multi-sensor fusion positioning system - Google Patents


Info

Publication number
CN114322996A
Authority
CN
China
Prior art keywords
pose
coordinate system
world coordinate
absolute
relative
Prior art date
Legal status
Granted
Application number
CN202011060481.7A
Other languages
Chinese (zh)
Other versions
CN114322996B (en)
Inventor
韩冰
张涛
边威
黄帅
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN202011060481.7A
Publication of CN114322996A
Application granted
Publication of CN114322996B
Legal status: Active

Landscapes

  • Navigation (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a pose optimization method and device for a multi-sensor fusion positioning system. The method comprises the following steps: fusing image data captured by a vision sensor in a preset time period with inertial navigation measurement data to obtain a first pose in a relative world coordinate system; and, based on the first pose and the observed position and observed heading of the global positioning system in an absolute world coordinate system, iteratively optimizing both a second pose (the first pose converted into the absolute world coordinate system) and the rotation external parameter between the relative world coordinate system and the absolute world coordinate system, to obtain an optimized value of the second pose and an optimized value of the rotation external parameter. The optimized value of the rotation external parameter is used when fusing the image data captured by the vision sensor in the next time period with the inertial navigation measurement data. The pose in the multi-sensor fusion positioning system can thus be optimized quickly and accurately.

Description

Pose optimization method and device of multi-sensor fusion positioning system
Technical Field
The invention relates to the technical field of high-precision positioning, in particular to a pose optimization method and a pose optimization device of a multi-sensor fusion positioning system.
Background
A multi-sensor fusion positioning system generally refers to a positioning system that includes a vision sensor, an Inertial Measurement Unit (IMU, also referred to as "inertial navigation") and a Global Navigation Satellite System (GNSS). Such a system offers high positioning accuracy at low cost, which makes multi-sensor fusion an important choice for high-precision positioning.
The prior art for pose optimization of multi-sensor fusion positioning systems generally includes:
(1) VINS-Mono algorithm
It only adds loop-closure detection to correct accumulated error, so the positioning error still accumulates and drifts over time.
(2) VI-ORB visual-inertial initialization algorithm
It performs local Bundle Adjustment on top of Pose Graph optimization. Absolute position information from a global positioning system is not included at first; if that absolute information is added, a sufficiently large window is required, and the Bundle Adjustment optimization then consumes excessive computing resources.
(3) Traditional Pose Graph scheme
It optimizes only the relative pose information and the absolute position information of a global positioning system. The world-frame rotation matrix is recovered from the latest frame, so when the fused pose or the raw observation of the latest frame is inaccurate, the world-frame rotation matrix easily becomes inaccurate and the optimization becomes unstable; furthermore, the initial pose solved with an inaccurate world-frame rotation matrix is also inaccurate, so the algorithm easily falls into a local optimum.
In summary, pose optimization of multi-sensor fusion positioning systems in the prior art suffers from low precision (error accumulation, convergence to local optima, etc.) and long optimization time.
Disclosure of Invention
In view of the above, the present invention has been made to provide a pose optimization method and apparatus for a multi-sensor fusion positioning system that overcomes or at least partially solves the above-mentioned problems.
In a first aspect, an embodiment of the present invention provides a pose optimization method for a multi-sensor fusion positioning system, including:
fusing the acquired image data captured by the vision sensor in a preset time period with inertial navigation measurement data to obtain a first pose in a relative world coordinate system;
and based on the first pose and the observation position and the observation course of the global positioning system in the absolute world coordinate system, performing iterative optimization on a second pose converted from the first pose to the absolute world coordinate system and a rotating external parameter between the relative world coordinate system and the absolute world coordinate system to obtain an optimized value of the second pose and an optimized value of the rotating external parameter, wherein the optimized value of the rotating external parameter is used for fusing image data shot by a vision sensor in the next time period with inertial navigation measurement data.
In some optional embodiments, iteratively optimizing the second pose into which the first pose is converted in the absolute world coordinate system and the rotation external parameter between the relative world coordinate system and the absolute world coordinate system, to obtain the optimized value of the second pose and the optimized value of the rotation external parameter, specifically includes:
iteratively optimizing a second position and a second heading angle in the second pose together with the rotation external parameter, to obtain an optimized value of the second position, an optimized value of the second heading angle and an optimized value of the rotation external parameter.
In some optional embodiments, the iteratively optimizing, based on the first pose and the observed position and the observed heading of the global positioning system in the absolute world coordinate system, the transformation of the first pose to the second pose in the absolute world coordinate system and the rotational external reference between the relative world coordinate system and the absolute world coordinate system specifically includes:
according to the first pose and the observation position and the observation course of the global positioning system in the absolute world coordinate system, establishing at least one of the following residual errors, and combining all the established residual errors into a target residual error:
a relative position change residual between a first position in the first pose and a second position in a second pose where the first pose is transformed into an absolute world coordinate system;
a relative attitude change residual between a first attitude in the first pose and a second attitude in the second pose into which the first pose is transformed in the absolute world coordinate system;
an absolute attitude change residual of the first pose;
absolute position residual error of an observation position of the global positioning system under an absolute world coordinate system;
and the absolute attitude residual error of the observation course of the global positioning system under the absolute world coordinate system.
In some alternative embodiments, the relative position change residual is established by:
converting the difference of the first positions at the capture times of two adjacent frames of images into a first difference in the inertial navigation coordinate system at the capture time of the previous frame, according to the rotation external parameter between that inertial navigation coordinate system and the relative world coordinate system;
determining a second difference in the inertial navigation coordinate system at the capture time of the previous frame, according to the difference of the second positions in the second poses to be optimized at the two capture times and the second attitude in the second pose at the capture time of the previous frame;
determining the difference between the first difference and the second difference as the relative position change residual.
In some optional embodiments, the relative attitude change residual is established by:
determining the relative attitude change residual according to the second attitudes in the second poses to be optimized at the capture times of two adjacent frames of images, and the rotation external parameters between the inertial navigation coordinate system and the relative world coordinate system at the two capture times respectively.
In some optional embodiments, the absolute attitude change residual of the first pose is established by:
determining the absolute attitude change residual of the first pose according to the rotation external parameter between the inertial navigation coordinate system and the relative world coordinate system, the second attitude in the second pose to be optimized, and the rotation external parameter between the relative world coordinate system and the absolute world coordinate system.
In some optional embodiments, the absolute position residual of the observed position of the global positioning system in the absolute world coordinate system is established by:
determining the absolute position residual according to the observed position of the global positioning system in the absolute world coordinate system, the second position and the second attitude in the second pose to be optimized, and the rotation external parameter between the inertial navigation coordinate system and the absolute world coordinate system.
In some optional embodiments, the absolute attitude residual of the observed heading of the global positioning system in the absolute world coordinate system is established by:
determining the absolute attitude residual according to the observed heading of the global positioning system in the absolute world coordinate system and the second attitude in the second pose to be optimized.
In some optional embodiments, before performing iterative optimization on the second pose converted from the first pose to the absolute world coordinate system and the external parameter of rotation between the relative world coordinate system and the absolute world coordinate system based on the first pose and the observed position and the observed heading of the global positioning system in the absolute world coordinate system, the method further includes:
judging whether the number of times the image data captured by the vision sensor has been fused with the inertial navigation measurement data reaches a preset number; or,
judging whether the number of frames corresponding to the currently acquired first pose reaches a preset number of frames.
In a second aspect, an embodiment of the present invention provides a pose optimization apparatus for a multi-sensor fusion positioning system, including:
the fusion module is used for fusing the acquired image data captured by the vision sensor in the preset time period with inertial navigation measurement data, to obtain a first pose in a relative world coordinate system;
and the optimization module is used for iteratively optimizing, according to the first pose obtained by the fusion module and the observed position and observed heading of the global positioning system in the absolute world coordinate system, the second pose into which the first pose is converted in the absolute world coordinate system and the rotation external parameter between the relative world coordinate system and the absolute world coordinate system, to obtain an optimized value of the second pose and an optimized value of the rotation external parameter, wherein the optimized value of the rotation external parameter is used for fusing the image data captured by the vision sensor in the next time period with the inertial navigation measurement data.
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium, on which computer instructions are stored, and when the instructions are executed by a processor, the method for optimizing the pose of the multi-sensor fusion positioning system is implemented.
In a fourth aspect, an embodiment of the present invention provides a server, including: the system comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, wherein the processor executes the program to realize the pose optimization method of the multi-sensor fusion positioning system.
According to the pose optimization method of the multi-sensor fusion positioning system, the acquired image data shot by the vision sensor in the preset time period is fused with the inertial navigation measurement data, and a first pose under a relative world coordinate system is obtained; and based on the first pose and the observation position and the observation course of the global positioning system in the absolute world coordinate system, performing iterative optimization on a second pose converted from the first pose to the absolute world coordinate system and a rotating external parameter between the relative world coordinate system and the absolute world coordinate system to obtain an optimized value of the second pose and an optimized value of the rotating external parameter, wherein the optimized value of the rotating external parameter is used for fusing image data shot by a visual sensor in the next time period with inertial navigation measurement data. The beneficial effects of the technical scheme at least comprise:
(1) An inner-layer Visual-Inertial Odometry (VIO) stage fuses the image data captured by the vision sensor with the inertial navigation measurement data to obtain the first pose in the relative world coordinate system. The outer layer introduces the observed position and observed heading of the global positioning system in the absolute world coordinate system as constraints and, starting from the first pose, optimizes the second pose and the rotation external parameter, which solves the accuracy problem caused by falling into local optima. The optimized value of the rotation external parameter is fed back into the inner-layer VIO for the next round of fusion, which improves the precision of the first pose output by the inner layer and solves the instability caused by an inaccurate estimate of the rotation external parameter between the relative world coordinate system and the absolute world coordinate system. Because the outer layer uses the global positioning system observations as constraints, it also eliminates the error accumulation caused by inner-layer drift and improves the precision of the second pose.
(2) The inner-layer VIO fusion is tightly coupled, so it has many optimization parameters and residuals and a heavy computation load; the outer-layer pose optimization is loosely coupled, with few parameters and residuals per frame and a light computation load. The two-layer scheme that separates the inner and outer layers therefore requires less computation than performing all of the optimization in the inner layer.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a flowchart of a pose optimization method of a multi-sensor fusion positioning system according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a pose optimization method of a multi-sensor fusion positioning system according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a specific implementation of a target residual error establishing method according to a second embodiment of the present invention;
FIG. 4 is a flowchart illustrating an implementation of step S31 in FIG. 3;
FIG. 5 is a diagram illustrating the principle of factor graph optimization according to an embodiment of the present invention;
fig. 6 is a schematic structural view of a pose optimization device of the multi-sensor fusion positioning system in the embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
In order to solve the problems of low pose optimization precision and long time consumption of a multi-sensor fusion positioning system in the prior art, the embodiment of the invention provides a pose optimization method and a pose optimization device of the multi-sensor fusion positioning system, which can quickly and accurately perform pose optimization on the multi-sensor fusion positioning system and have small calculated amount.
Example one
The embodiment of the invention provides a pose optimization method of a multi-sensor fusion positioning system, the flow of which is shown in figure 1, and the method comprises the following steps:
step S11: and fusing the acquired image data shot by the vision sensor in the preset time period with inertial navigation measurement data to obtain a first attitude in a relative world coordinate system.
According to the capture time of each frame of image taken by the vision sensor, the inertial navigation measurement data and the image data are first aligned in time and then fused to obtain the first pose in the relative world coordinate system. Specifically, the first pose may be either the inertial navigation pose or the vision sensor pose; the two can be converted into each other through the rotation external parameter between the inertial navigation unit and the vision sensor.
Specifically, the first position in the first pose comprises the x, y and z coordinates in the relative world coordinate system; the first attitude comprises the heading angle Yaw, the roll angle Roll and the pitch angle Pitch.
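As a non-limiting illustration, a first pose of this form can be sketched in code. The function name `ypr_to_rotation`, the dictionary layout and the Z-Y-X (yaw-pitch-roll) rotation order are assumptions of this sketch, not part of the claimed method:

```python
import math

def ypr_to_rotation(yaw, pitch, roll):
    """Build a 3x3 rotation matrix from heading (Yaw), Pitch and Roll
    angles in radians, assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

# A "first pose" as described above: x, y, z plus yaw/roll/pitch.
first_pose = {"position": (1.0, 2.0, 0.5),
              "attitude": ypr_to_rotation(0.1, 0.0, 0.0)}
```

Any convention works as long as it is used consistently on both sides of each residual.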
The image data and the inertial navigation measurement data can be fused with existing techniques; this embodiment does not limit the specific fusion method.
Step S12: and based on the first pose and the observation position and the observation course of the global positioning system in the absolute world coordinate system, performing iterative optimization on a second pose converted from the first pose to the absolute world coordinate system and a rotating external parameter between the relative world coordinate system and the absolute world coordinate system to obtain an optimized value of the second pose and an optimized value of the rotating external parameter.
The relative world coordinate system can also be understood as an initial world coordinate system, i.e. the world coordinate system before the attitude optimization (more precisely, before the heading-angle optimization); the absolute world coordinate system is the coordinate system after the attitude optimization.
In one embodiment, a second position and a second heading angle in the second pose, together with the rotation external parameter, may be iteratively optimized to obtain an optimized value of the second position, an optimized value of the second heading angle and an optimized value of the rotation external parameter.
Specifically, at least one of the following residuals may be established according to the first pose and the observed position and observed heading of the global positioning system in the absolute world coordinate system, and all established residuals are combined into a target residual:
(1) a relative position change residual between the first position in the first pose and the second position in the second pose into which the first pose is converted in the absolute world coordinate system;
(2) a relative attitude change residual between the first attitude in the first pose and the second attitude in the second pose;
(3) an absolute attitude change residual of the first pose;
(4) an absolute position residual of the observed position of the global positioning system in the absolute world coordinate system;
(5) an absolute attitude residual of the observed heading of the global positioning system in the absolute world coordinate system.
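The combination of the established residuals into a single target residual can be sketched as follows. This is a minimal illustration in the least-squares sense, assuming each residual block has already been evaluated as a list of numbers; the function name and the unit weighting are assumptions:

```python
def target_residual(residual_blocks):
    """Stack the established residual blocks into one target residual
    vector and return it together with its squared norm (the scalar
    cost an iterative optimizer would drive down)."""
    stacked = [r for block in residual_blocks for r in block]
    cost = sum(r * r for r in stacked)
    return stacked, cost

blocks = [
    [0.1, -0.2, 0.0],   # e.g. a relative position change residual
    [0.05],             # e.g. an absolute attitude residual of the heading
]
vec, cost = target_residual(blocks)
```

In practice each block would also carry an information (weight) matrix, omitted here for brevity.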
The global positioning system, specifically, the global navigation satellite system, may be any satellite navigation system, such as a GPS or a beidou satellite navigation system.
The established target residual contains, as variables to be optimized, the second pose into which the first pose is converted in the absolute world coordinate system and the rotation external parameter between the relative world coordinate system and the absolute world coordinate system.
The specific determination method of each residual is described in detail in the following embodiment two.
The optimized value of the rotation external parameter is then used when fusing the image data captured by the vision sensor in the next time period with the inertial navigation measurement data, which makes the fusion that produces the first pose more accurate.
Specifically, the termination condition of the optimization may be that the target residual reaches a set condition, for example that the target residual is smaller than a set residual threshold; or that the target residual is smaller than the set threshold and the number of iterations reaches a preset number; or any other preset termination condition for iterative optimization.
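The termination logic above can be sketched as follows; the names `iterate` and `optimize_step` and the toy geometrically shrinking residual are assumptions of this illustration, not the claimed optimizer:

```python
def iterate(optimize_step, residual_threshold=1e-6, max_iters=50):
    """Run iterative optimization until the target residual falls below
    a set threshold or a preset iteration count is reached -- the two
    termination conditions described above. `optimize_step` performs one
    iteration and returns the current target residual."""
    residual = float("inf")
    for i in range(max_iters):
        residual = optimize_step()
        if residual < residual_threshold:
            break
    return residual, i + 1

# Toy step whose residual shrinks by a factor of 10 per iteration.
state = {"r": 1.0}
def step():
    state["r"] *= 0.1
    return state["r"]

res, iters = iterate(step, residual_threshold=1e-4)
```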
Referring to fig. 2, the above steps can be summarized as follows. The inner-layer VIO fuses the measurement data of the vision sensor (e.g. Camera) and of the Inertial Measurement Unit (IMU), combined with the velocity and latitude/longitude information measured by the global positioning system (e.g. GPS), to estimate the first pose in the relative world coordinate system (Local Estimator). Then, taking the first pose (Local Pose) as input and combining the latitude/longitude information measured by the global positioning system, the outer layer performs pose optimization in the absolute world coordinate system (Global Pose Graph Optimization) and outputs the pose in the absolute world coordinate system (Global Pose), while returning the rotation external parameter between the relative and absolute world coordinate systems (World Frame R) to the Local Estimator for the VIO fusion of the next time period. The constraint provided by the global positioning system measurements solves the inner-layer VIO drift problem, realizes global optimization, and improves the pose accuracy.
In an embodiment, before performing step S12, it is judged whether the number of times the image data captured by the vision sensor has been fused with the inertial navigation measurement data reaches a preset number; or whether the number of frames corresponding to the currently acquired first pose reaches a preset number of frames.
The fusion of the image data captured by the vision sensor with the inertial navigation measurement data, i.e. the VIO fusion, is tightly coupled: it has many optimization parameters and residuals and a heavy computation load, so the inner-layer VIO uses a small window and optimizes only a few frames each time. The outer-layer pose optimization, i.e. the optimization of the second pose, is loosely coupled: each frame contributes relatively few parameters and residuals and the computation load is light, so the outer layer uses a large window and optimizes many frames each time. The outer layer can therefore run once after several rounds of inner-layer fusion, based on their accumulated results.
If so, step S12 is performed; if not, the image data captured by the vision sensor and the inertial navigation measurement data of the next time period continue to be acquired, and step S11 is performed.
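The gating of steps S11/S12 described above can be sketched as a simple counter check; the function name and the preset count of 5 are assumptions of this illustration:

```python
def ready_for_outer_optimization(fusion_count, preset_count):
    """Gate the outer-layer optimization (step S12): run it only after
    the inner-layer VIO fusion has executed a preset number of times
    (equivalently, after a preset number of first-pose frames)."""
    return fusion_count >= preset_count

fusion_count = 0
for _ in range(10):            # each loop is one inner-layer fusion (step S11)
    fusion_count += 1
    if ready_for_outer_optimization(fusion_count, preset_count=5):
        # ... run the outer-layer pose optimization here (step S12) ...
        fusion_count = 0       # start accumulating a new window
```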
Example two
The second embodiment of the present invention provides a specific implementation of a method for establishing a target residual error to be optimized in a pose optimization process of a multi-sensor fusion positioning system, and the flow of the method is shown in fig. 3, and the method includes the following steps:
step S31: a relative position change residual of a first position in the first pose and a second position in the second pose is established.
Referring to fig. 4, the method comprises the following steps:
step S311: and converting the difference value of the first positions of the shooting moments of the two adjacent frames of images into a first difference value under the inertial navigation coordinate system of the shooting moment of the previous frame of image according to the rotation external parameters of the inertial navigation coordinate system of the shooting moment of the previous frame of image in the two frames and the relative world coordinate system.
Step S312: and determining a second difference value under the inertial navigation coordinate system at the shooting time of the previous frame of image according to the difference value of the second position in the second pose to be optimized at the shooting time of the two frames of images and the second pose in the second pose at the shooting time of the previous frame of image.
Step S313: determining a difference between the first difference and the second difference as a relative position change residual.
Specifically, the relative position change residual between the first position in the first pose and the second position in the second pose may be expressed by the following formula:

r_{Δp,k} = (R^{w_r}_{b_k})^T (p^{w_r}_{b_{k+1}} − p^{w_r}_{b_k}) − (R^{w_a}_{b_k})^T (p^{w_a}_{b_{k+1}} − p^{w_a}_{b_k})

wherein:
r_{Δp,k} is the relative position change residual of the first positions at the k-th and (k+1)-th frame times;
R^{w_r}_{b_k} is the rotation external parameter between the inertial navigation coordinate system at the k-th frame time and the relative world coordinate system;
p^{w_r}_{b_k} is the first position at the k-th frame time, i.e. the position of the inertial navigation unit in the relative world coordinate system;
p^{w_a}_{b_k} is the second position at the k-th frame time, i.e. the position of the inertial navigation unit, to be optimized, in the absolute world coordinate system;
R^{w_a}_{b_k} is the second attitude at the k-th frame time, i.e. the attitude of the inertial navigation unit, to be optimized, in the absolute world coordinate system;
k = 0, 1, …, n, where n + 1 is the total number of image frames.
The k-th frame time, here and below, is the shooting time at which the vision sensor shoots the k-th frame image.
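Purely as an illustrative sketch (the function and variable names are assumptions of this example, not taken from the patent), steps S311 to S313 can be written with NumPy:

```python
import numpy as np

def relative_position_residual(R_bk_w, p_w_k, p_w_k1, R_W_bk_hat, p_W_k_hat, p_W_k1_hat):
    """Relative position change residual between frame k and frame k+1.

    R_bk_w                -- 3x3 rotation from the relative world frame to the
                             inertial navigation frame at frame k (rotation external parameter)
    p_w_k, p_w_k1         -- first positions (VIO output) in the relative world frame
    R_W_bk_hat            -- second posture at frame k (body to absolute world), to be optimized
    p_W_k_hat, p_W_k1_hat -- second positions in the absolute world frame, to be optimized
    """
    first_diff = R_bk_w @ (p_w_k1 - p_w_k)                 # step S311
    second_diff = R_W_bk_hat.T @ (p_W_k1_hat - p_W_k_hat)  # step S312 (R.T == R^-1 for rotations)
    return first_diff - second_diff                        # step S313
```

When the optimized trajectory agrees with the VIO trajectory up to a rigid transform, this residual is zero.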
Step S32: a relative posture change residual between a first posture in the first pose and a second posture in the second pose is established.
In one embodiment, the relative posture change residual may be determined according to the second postures in the second poses to be optimized at the shooting times of two adjacent frames of images and the rotation external parameters between the inertial navigation coordinate system and the relative world coordinate system at the shooting times of the two frames of images. The relative posture change residual r_{\Delta R}^{k} between the first posture in the first pose and the second posture in the second pose may be expressed by the following formula:

r_{\Delta R}^{k} = \left(R_{w}^{b_k}\left(R_{w}^{b_{k+1}}\right)^{-1}\right)^{-1}\left(\hat{R}_{b_k}^{W}\right)^{-1}\hat{R}_{b_{k+1}}^{W}
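As an illustrative sketch (names are assumptions of this example), the relative posture change residual compares the frame-to-frame rotation measured by VIO with the one implied by the second postures to be optimized:

```python
import numpy as np

def relative_posture_residual(R_bk_w, R_bk1_w, R_W_bk_hat, R_W_bk1_hat):
    """Relative posture change residual between frames k and k+1.

    Returns the identity matrix when the relative rotation from the VIO
    extrinsics agrees with the one implied by the optimized second postures.
    """
    R_rel_vio = R_bk_w @ R_bk1_w.T          # relative rotation b_k -> b_{k+1} from VIO
    R_rel_opt = R_W_bk_hat.T @ R_W_bk1_hat  # relative rotation of the optimized postures
    return R_rel_vio.T @ R_rel_opt          # residual rotation
```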
step S33: and establishing an absolute posture conversion residual error of the first posture.
In one embodiment, the absolute posture conversion residual of the first posture may be determined according to the rotation external parameter between the inertial navigation coordinate system and the relative world coordinate system, the second posture in the second pose to be optimized, and the rotation external parameter between the relative world coordinate system and the absolute world coordinate system.
See the following formula:

r_{R}^{k} = \left(\hat{R}_{w}^{W}\left(R_{w}^{b_k}\right)^{-1}\right)^{-1}\hat{R}_{b_k}^{W}

wherein r_{R}^{k} is the absolute posture conversion residual of the first posture at the k-th frame time; \hat{R}_{w}^{W} is the rotation external parameter between the relative world coordinate system and the absolute world coordinate system to be optimized.
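A minimal sketch of this residual (names are assumptions of this example): the first posture is first carried from the inertial navigation frame into the absolute world frame through the two rotation external parameters, then compared with the optimized second posture:

```python
import numpy as np

def absolute_posture_residual(R_bk_w, R_W_w_hat, R_W_bk_hat):
    """Absolute posture conversion residual of the first posture.

    R_W_w_hat rotates the relative world frame into the absolute world frame
    (the rotation external parameter to be optimized).  The residual is the
    identity matrix when the predicted absolute posture matches the optimized one.
    """
    R_pred = R_W_w_hat @ R_bk_w.T   # body -> relative world -> absolute world
    return R_pred.T @ R_W_bk_hat    # residual rotation
```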
Step S34: and establishing an absolute position residual error of an observed position of the global positioning system under an absolute world coordinate system.
In one embodiment, the absolute position residual of the observed position of the global positioning system in the absolute world coordinate system may be determined according to the observed position of the global positioning system in the absolute world coordinate system, the second position and the second posture in the second pose to be optimized, and the relative position external parameter between the antenna of the global positioning system and the inertial navigation.
See the following formula:

r_{p}^{k} = \hat{p}_{k}^{W} + \hat{R}_{b_k}^{W}\, p_{G}^{b} - \hat{p}_{G,k}^{W}

wherein r_{p}^{k} is the absolute position residual of the observed position of the global positioning system in the absolute world coordinate system at the k-th frame time; p_{G}^{b} is the relative position external parameter (lever arm) between the antenna of the global positioning system and the inertial navigation, expressed in the inertial navigation coordinate system; \hat{p}_{G,k}^{W} is the observed position of the global positioning system in the absolute world coordinate system at the k-th frame time.
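A one-line sketch of this residual (names are assumptions of this example): the antenna lever arm is rotated by the optimized second posture before the predicted antenna position is compared with the observation:

```python
import numpy as np

def gps_position_residual(p_W_k_hat, R_W_bk_hat, lever_arm_b, p_W_gps_k):
    """Absolute position residual of the GPS observation at frame k.

    lever_arm_b is the relative position external parameter between the GPS
    antenna and the inertial navigation, expressed in the inertial navigation
    (body) frame.
    """
    return p_W_k_hat + R_W_bk_hat @ lever_arm_b - p_W_gps_k
```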
Step S35: and establishing an absolute attitude residual error of an observed course of the global positioning system under an absolute world coordinate system.
In one embodiment, the method may include determining an absolute pose residual of the observed heading of the global positioning system in the absolute world coordinate system based on the observed heading of the global positioning system in the absolute world coordinate system and a second pose in a second pose to be optimized.
Because the Z value optimized by VIO is inaccurate and the Z value observed by the global positioning system is not smooth, the Z value is easily coupled with the roll angle Roll and the pitch angle Pitch, making the estimates of Roll and Pitch inaccurate. Therefore, the overall rotation optimization can be decomposed online: only the heading angle Yaw in the attitude is optimized, and the full attitude is then recombined with the pitch angle Pitch and the roll angle Roll from the VIO result. This avoids the inaccurate estimation of Roll and Pitch caused by error coupling and improves the attitude optimization accuracy. Accordingly, the absolute attitude residual of the observed heading of the global positioning system in the absolute world coordinate system can be established as:
r_{\psi}^{k} = \hat{\psi}_{k}^{W} - \psi_{G,k}^{W}

wherein r_{\psi}^{k} is the absolute attitude residual of the observed heading of the global positioning system in the absolute world coordinate system at the k-th frame time; \hat{\psi}_{k}^{W} is the heading angle of the second posture at the k-th frame time in the absolute world coordinate system; \psi_{G,k}^{W} is the observed heading (angle) of the global positioning system in the absolute world coordinate system at the k-th frame time.
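A sketch of the heading residual (the angle wrapping is an assumption of this example; the patent does not specify how the difference is normalized):

```python
import numpy as np

def heading_residual(yaw_hat, yaw_gps):
    """Absolute attitude residual of the observed heading.

    The angle difference is wrapped to [-pi, pi) so that headings on either
    side of the +/-pi boundary do not produce a spurious 2*pi jump.
    """
    d = yaw_hat - yaw_gps
    return (d + np.pi) % (2.0 * np.pi) - np.pi
```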
To sum up, the parameters to be optimized in the residuals are the second position \hat{p}_{k}^{W} of each frame, the heading angle \hat{\psi}_{k}^{W} of the second posture of each frame, and the rotation external parameter \hat{R}_{w}^{W} between the relative world coordinate system and the absolute world coordinate system. For the second posture of each frame appearing in the residuals, e.g. the second posture \hat{R}_{b_k}^{W} of the k-th frame, only its heading angle \hat{\psi}_{k}^{W} needs to be optimized.
The steps S31 to S35 are not limited to a particular order; any one or more of the steps may be executed first, or the steps may be executed simultaneously.
Step S36: and combining the built residuals into a target residual.
The combined target residual \chi may be:

\chi = \sum_{k}\left( \left\| r_{\Delta p}^{k} \right\|_{\Omega p_k}^{2} + \left\| r_{\Delta R}^{k} \right\|_{\Omega \phi_k}^{2} + \left\| r_{R}^{k} \right\|_{\Omega \phi v_k}^{2} + \rho\left( \left\| r_{p}^{k} \right\|_{\Omega p w_k}^{2} \right) + \rho\left( \left\| r_{\psi}^{k} \right\|_{\Omega \phi w_k}^{2} \right) \right)

wherein \Omega p_k is the first position covariance at the k-th frame time; \Omega \phi_k is the first posture covariance at the k-th frame time; \Omega \phi v_k is the absolute posture covariance of the first posture at the k-th frame time; \Omega \phi w_k is the absolute attitude covariance, at the k-th frame time, of the observed heading of the global positioning system in the absolute world coordinate system; \Omega p w_k is the absolute position covariance, at the k-th frame time, of the observed position of the global positioning system in the absolute world coordinate system; \rho(x) represents the elimination of outliers in x.
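As an illustrative sketch of how the residuals could be combined (the Huber kernel is an assumption of this example; the patent only states that \rho eliminates outliers, not which kernel is used):

```python
import numpy as np

def huber(sq_norm, delta=1.0):
    # One common robust kernel rho(.) used to down-weight outliers (assumption)
    if sq_norm <= delta ** 2:
        return sq_norm
    return 2.0 * delta * np.sqrt(sq_norm) - delta ** 2

def weighted_sq_norm(r, omega):
    # ||r||^2_Omega = r^T Omega r, with Omega the information (inverse covariance) matrix
    r = np.atleast_1d(np.asarray(r, dtype=float))
    return float(r @ omega @ r)

def target_residual(relative_terms, absolute_terms):
    """Combine the residuals of steps S31-S35 into the target residual chi.

    relative_terms: iterable of (residual, information) pairs
    absolute_terms: iterable of (residual, information, robust) triples;
                    the GPS terms would typically set robust=True
    """
    chi = sum(weighted_sq_norm(r, om) for r, om in relative_terms)
    for r, om, robust in absolute_terms:
        term = weighted_sq_norm(r, om)
        chi += huber(term) if robust else term
    return chi
```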
The embodiment of the invention provides a method for determining a pose based on 4DOF and absolute-attitude-constrained factor graph optimization. Specifically, 4DOF refers to x, y and z in the position plus the heading angle in the posture, and the absolute attitude constraint means that the observation data of the global positioning system in the absolute world coordinate system is used as a constraint. The factor graph optimization principle is shown in fig. 5, wherein X1-X6 represent the first poses of each frame (the subscript represents the frame number) in the relative world coordinate system obtained by fusing the image data shot by the vision sensor with the inertial navigation measurement data, and R_{ww0} is the rotation external parameter between the relative world coordinate system and the absolute world coordinate system to be optimized. Factor (i) represents the relative position change residual of the first position and the relative posture change residual of the first posture between two adjacent frames; factor (ii) represents the absolute posture conversion residual of the first posture; factor (iii) represents the absolute position residual of the observed position and the absolute attitude residual of the observed heading of the global positioning system in the absolute world coordinate system. Iterative optimization of the total factor formed by these factors realizes the conversion from the pose in the relative world coordinate system (Local Coordination) to the pose in the absolute world coordinate system (Global Coordination).
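The iterative optimization itself can be sketched, in a greatly simplified form, with SciPy's nonlinear least-squares solver. This toy example (all names are assumptions) only estimates a heading angle and a translation that map a trajectory in the relative world frame onto observations in the absolute world frame; it is not the full patented factor graph:

```python
import numpy as np
from scipy.optimize import least_squares

def align_4dof(p_local, p_gps):
    """Estimate yaw + translation (a 4DOF subset of the relative-to-absolute
    world transform) aligning an (N, 3) VIO trajectory with GPS positions."""
    def residuals(x):
        yaw, t = x[0], x[1:4]
        c, s = np.cos(yaw), np.sin(yaw)
        R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])  # rotation about z (heading)
        return ((p_local @ R.T) + t - p_gps).ravel()
    sol = least_squares(residuals, x0=np.zeros(4))  # iterative nonlinear least squares
    return sol.x[0], sol.x[1:4]
```

In the patent's formulation the state additionally contains the per-frame second positions and heading angles, and the residuals are the five factor terms described above.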
Based on the inventive concept of the present invention, an embodiment of the present invention further provides a pose optimization apparatus for a multi-sensor fusion positioning system, which has a structure as shown in fig. 6, and includes:
the fusion module 61 is configured to fuse the acquired image data shot by the vision sensor within the preset time period with the inertial navigation measurement data to obtain a first pose in a relative world coordinate system;
and the optimization module 62 is configured to perform iterative optimization on a second pose converted from the first pose to the absolute world coordinate system and a rotation external parameter between the relative world coordinate system and the absolute world coordinate system according to the first pose obtained by the fusion module 61 and the observed position and the observed heading of the global positioning system in the absolute world coordinate system, so as to obtain an optimized value of the second pose and an optimized value of the rotation external parameter, where the optimized value of the rotation external parameter is used for fusing image data shot by the vision sensor in a next time period with inertial navigation measurement data.
In an embodiment, the optimization module 62 performs iterative optimization on the second pose converted from the first pose to the absolute world coordinate system and the external rotation parameter between the relative world coordinate system and the absolute world coordinate system to obtain the optimized value of the second pose and the optimized value of the external rotation parameter, and is specifically configured to:
and iteratively optimizing a second position, a second course angle and the rotating external parameter in the second pose to obtain an optimized value of the second position, an optimized value of the second course angle and an optimized value of the rotating external parameter.
In one embodiment, the optimization module 62 is configured to perform iterative optimization on the first pose converted to the second pose in the absolute world coordinate system and the rotational external parameter between the relative world coordinate system and the absolute world coordinate system based on the first pose and the observed position and the observed heading of the global positioning system in the absolute world coordinate system, and is specifically configured to:
according to the first pose and the observation position and the observation course of the global positioning system in the absolute world coordinate system, establishing at least one of the following residual errors, and combining all the established residual errors into a target residual error:
a relative position change residual between a first position in the first pose and a second position in a second pose where the first pose is transformed into an absolute world coordinate system; a relative pose change residual between a first pose in the first poses and a second pose in which the first pose is transformed to an absolute world coordinate system; an absolute pose transition residual for the first pose; absolute position residual error of an observation position of the global positioning system under an absolute world coordinate system; and the absolute attitude residual error of the observation course of the global positioning system under the absolute world coordinate system.
In one embodiment, the apparatus further includes a determining module 63, and before the optimizing module 62 performs iterative optimization on the first pose converted to the second pose in the absolute world coordinate system and the external rotation parameter between the relative world coordinate system and the absolute world coordinate system based on the first pose and the observed position and the observed heading of the global positioning system in the absolute world coordinate system, the determining module 63 is configured to:
judging whether the fusion times of the image data shot by the visual sensor and the inertial navigation measurement data reach preset times; or judging whether the frame number corresponding to the currently acquired first pose reaches a preset frame number.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Based on the inventive concept of the present invention, an embodiment of the present invention further provides a computer-readable storage medium, on which computer instructions are stored, and when the instructions are executed by a processor, the method for optimizing the pose of the multi-sensor fusion positioning system is implemented.
Based on the inventive concept of the present invention, an embodiment of the present invention further provides a server, including: the system comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, wherein the processor executes the program to realize the pose optimization method of the multi-sensor fusion positioning system.
Unless specifically stated otherwise, terms such as processing, computing, calculating, determining, displaying, or the like, may refer to an action and/or process of one or more processing or computing systems or similar devices that manipulates and transforms data represented as physical (e.g., electronic) quantities within the processing system's registers and memories into other data similarly represented as physical quantities within the processing system's memories, registers or other such information storage, transmission or display devices. Information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
It should be understood that the specific order or hierarchy of steps in the processes disclosed is an example of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged without departing from the scope of the present disclosure. The accompanying method claims present elements of the various steps in a sample order, and are not intended to be limited to the specific order or hierarchy presented.
In the foregoing detailed description, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments of the subject matter require more features than are expressly recited in each claim. Rather, as the following claims reflect, invention lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby expressly incorporated into the detailed description, with each claim standing on its own as a separate preferred embodiment of the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. Of course, the processor and the storage medium may reside as discrete components in a user terminal.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in memory units and executed by processors. The memory unit may be implemented within the processor or external to the processor, in which case it can be communicatively coupled to the processor via various means as is known in the art.
What has been described above includes examples of one or more embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the aforementioned embodiments, but one of ordinary skill in the art may recognize that many further combinations and permutations of various embodiments are possible. Accordingly, the embodiments described herein are intended to embrace all such alterations, modifications and variations that fall within the scope of the appended claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim. Furthermore, any use of the term "or" in the specification of the claims is intended to mean a "non-exclusive or". The terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.

Claims (11)

1. A pose optimization method of a multi-sensor fusion positioning system comprises the following steps:
fusing the acquired image data shot by the vision sensor in a preset time period with inertial navigation measurement data to obtain a first pose in a relative world coordinate system;
and based on the first pose and the observation position and the observation course of the global positioning system in the absolute world coordinate system, performing iterative optimization on a second pose converted from the first pose to the absolute world coordinate system and a rotating external parameter between the relative world coordinate system and the absolute world coordinate system to obtain an optimized value of the second pose and an optimized value of the rotating external parameter, wherein the optimized value of the rotating external parameter is used for fusing image data shot by a vision sensor in the next time period with inertial navigation measurement data.
2. The method according to claim 1, wherein the iteratively optimizing the second pose converted from the first pose to the absolute world coordinate system and the rotating external parameter between the relative world coordinate system and the absolute world coordinate system to obtain the optimized value of the second pose and the optimized value of the rotating external parameter includes:
and iteratively optimizing a second position, a second course angle and the rotating external parameter in the second pose to obtain an optimized value of the second position, an optimized value of the second course angle and an optimized value of the rotating external parameter.
3. The method of claim 1, wherein the iterative optimization of the transformation of the first pose to the second pose in the absolute world coordinate system and the rotational external reference between the relative world coordinate system and the absolute world coordinate system based on the first pose and the observed position and the observed heading of the global positioning system in the absolute world coordinate system comprises:
according to the first pose and the observation position and the observation course of the global positioning system in the absolute world coordinate system, establishing at least one of the following residual errors, and combining all the established residual errors into a target residual error:
a relative position change residual between a first position in the first pose and a second position in a second pose where the first pose is transformed into an absolute world coordinate system;
a relative pose change residual between a first pose in the first poses and a second pose in which the first pose is transformed to an absolute world coordinate system;
an absolute pose transition residual for the first pose;
absolute position residual error of an observation position of the global positioning system under an absolute world coordinate system;
and the absolute attitude residual error of the observation course of the global positioning system under the absolute world coordinate system.
4. The method of claim 3, wherein the relative position change residual is established by:
converting the difference value of the first positions of the shooting moments of two adjacent frames of images into a first difference value under the inertial navigation coordinate system of the shooting moment of the previous frame of image according to the rotation external parameters of the inertial navigation coordinate system of the shooting moment of the previous frame of image in the two frames and the relative world coordinate system;
determining a second difference value under the inertial navigation coordinate system at the shooting time of the previous frame of image according to the difference value of a second position in a second pose to be optimized at the shooting time of the two frames of images and a second posture in a second pose at the shooting time of the previous frame of image;
determining a difference between the first difference and the second difference as a relative position change residual.
5. The method of claim 3, wherein the relative pose change residuals are created by:
and determining a relative posture change residual error according to a second posture in a second posture to be optimized at the shooting time of two adjacent frames of images and the rotation external parameters of the inertial navigation coordinate system and the relative world coordinate system at the shooting time of the two frames of images respectively.
6. The method of claim 3, the absolute pose change residual for the first pose being established by:
and determining the absolute attitude change residual error of the first attitude according to the rotation external parameters of the inertial navigation coordinate system and the relative world coordinate system, the second attitude in the second attitude to be optimized and the rotation external parameters between the relative world coordinate system and the absolute world coordinate system.
7. The method of claim 3, wherein the absolute position residual of the global positioning system's observed position in the absolute world coordinate system is established by:
and determining the absolute position residual error of the observation position of the global positioning system under the absolute world coordinate system according to the observation position of the global positioning system under the absolute world coordinate system, the second position and the second posture in the second pose to be optimized and the rotation external parameters between the inertial navigation coordinate system and the absolute world coordinate system.
8. The method of claim 3, wherein the absolute pose residual of the global positioning system's observed heading in the absolute world coordinate system is established by:
and determining the absolute attitude residual error of the observation course of the global positioning system under the absolute world coordinate system according to the observation course of the global positioning system under the absolute world coordinate system and the second attitude in the second attitude to be optimized.
9. The method of any one of claims 1 to 8, before performing iterative optimization on the second pose converted from the first pose to the absolute world coordinate system and the external rotation parameter between the relative world coordinate system and the absolute world coordinate system based on the first pose and the observed position and the observed heading of the global positioning system in the absolute world coordinate system, further comprising:
judging whether the fusion times of the image data shot by the visual sensor and the inertial navigation measurement data reach preset times; or,
judging whether the frame number corresponding to the currently acquired first pose reaches a preset frame number.
10. A pose optimization device of a multi-sensor fusion positioning system comprises:
the fusion module is used for fusing the acquired image data shot by the vision sensor in the preset time period with inertial navigation measurement data to obtain a first pose in a relative world coordinate system;
and the optimization module is used for performing iterative optimization on a second pose converted from the first pose to the absolute world coordinate system and a rotating external parameter between the relative world coordinate system and the absolute world coordinate system according to the first pose obtained by the fusion module and the observed position and the observed course of the global positioning system in the absolute world coordinate system to obtain an optimized value of the second pose and an optimized value of the rotating external parameter, wherein the optimized value of the rotating external parameter is used for fusing image data shot by the vision sensor in the next time period with inertial navigation measurement data.
11. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the pose optimization method of the multi-sensor fusion positioning system of any one of claims 1 to 9.
CN202011060481.7A 2020-09-30 2020-09-30 Pose optimization method and device of multi-sensor fusion positioning system Active CN114322996B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011060481.7A CN114322996B (en) 2020-09-30 2020-09-30 Pose optimization method and device of multi-sensor fusion positioning system


Publications (2)

Publication Number Publication Date
CN114322996A true CN114322996A (en) 2022-04-12
CN114322996B CN114322996B (en) 2024-03-19

Family

ID=81011228

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011060481.7A Active CN114322996B (en) 2020-09-30 2020-09-30 Pose optimization method and device of multi-sensor fusion positioning system

Country Status (1)

Country Link
CN (1) CN114322996B (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110012827A1 (en) * 2009-07-14 2011-01-20 Zhou Ye Motion Mapping System
WO2016187757A1 (en) * 2015-05-23 2016-12-01 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
CN107869989A (en) * 2017-11-06 2018-04-03 东北大学 A kind of localization method and system of the fusion of view-based access control model inertial navigation information
CN109029433A (en) * 2018-06-28 2018-12-18 东南大学 Join outside the calibration of view-based access control model and inertial navigation fusion SLAM on a kind of mobile platform and the method for timing
CN109993113A (en) * 2019-03-29 2019-07-09 东北大学 A kind of position and orientation estimation method based on the fusion of RGB-D and IMU information
CN110514225A (en) * 2019-08-29 2019-11-29 中国矿业大学 The calibrating external parameters and precise positioning method of Multi-sensor Fusion under a kind of mine
CN110706279A (en) * 2019-09-27 2020-01-17 清华大学 Global position and pose estimation method based on information fusion of global map and multiple sensors
WO2020087846A1 (en) * 2018-10-31 2020-05-07 东南大学 Navigation method based on iteratively extended kalman filter fusion inertia and monocular vision
CN111207774A (en) * 2020-01-17 2020-05-29 山东大学 Method and system for laser-IMU external reference calibration


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113375665A (en) * 2021-06-18 2021-09-10 Xidian University Unmanned aerial vehicle pose estimation method based on multi-sensor elastic coupling
CN113375665B (en) * 2021-06-18 2022-12-02 Xidian University Unmanned aerial vehicle pose estimation method based on multi-sensor elastic coupling

Also Published As

Publication number Publication date
CN114322996B (en) 2024-03-19

Similar Documents

Publication Publication Date Title
CN109297510B (en) Relative pose calibration method, device, equipment and medium
CN112268559B (en) Mobile measurement method for fusing SLAM technology in complex environment
CN108592950B (en) Calibration method for relative installation angle of monocular camera and inertial measurement unit
CN102289804B (en) System and method for three dimensional video stabilisation by fusing orientation sensor readings with image alignment estimates
CN107167826B (en) Vehicle longitudinal positioning system and method based on variable grid image feature detection in automatic driving
US20110292166A1 (en) North Centered Orientation Tracking in Uninformed Environments
CN110187375A (en) Method and device for improving positioning accuracy based on SLAM positioning results
CN112781586B (en) Pose data determination method and device, electronic equipment and vehicle
CN110081881A (en) Carrier landing guidance method based on UAV multi-sensor information fusion technology
Niu et al. Development and evaluation of GNSS/INS data processing software for position and orientation systems
CN112835085B (en) Method and device for determining vehicle position
CN113933818A (en) Method, device, storage medium and program product for calibrating laser radar external parameter
CN103512584A (en) Navigation attitude information output method, device and strapdown navigation attitude reference system
CN110032201A (en) Method for airborne fusion of IMU and visual attitude based on Kalman filtering
CN112835086B (en) Method and device for determining vehicle position
CN116184430B (en) Pose estimation algorithm fusing lidar, visible light camera and inertial measurement unit
CN112577493A (en) Unmanned aerial vehicle autonomous positioning method and system based on remote sensing map assistance
CN114396943A (en) Fusion positioning method and terminal
CN110779514B (en) Hierarchical Kalman fusion method and device for auxiliary attitude determination of bionic polarization navigation
CN112129321A (en) Gyro zero offset calibration value determining method and device and computer storage medium
US20220057517A1 (en) Method for constructing point cloud map, computer device, and storage medium
CN114322996B (en) Pose optimization method and device of multi-sensor fusion positioning system
CN112907663A (en) Positioning method, computer program product, device and system
CN109470269B (en) Calibration method, calibration equipment and calibration system for space target measuring mechanism
CN116481543A (en) Multi-sensor fusion double-layer filtering positioning method for mobile robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant