GB2597632A - RGBD sensor and IMU sensor-based positioning method - Google Patents

RGBD sensor and IMU sensor-based positioning method

Info

Publication number
GB2597632A
GB2597632A GB2116617.8A GB202116617A GB2597632A GB 2597632 A GB2597632 A GB 2597632A GB 202116617 A GB202116617 A GB 202116617A GB 2597632 A GB2597632 A GB 2597632A
Authority
GB
United Kingdom
Prior art keywords
sensor
keyframe
coordinate system
rgb
imu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2116617.8A
Other versions
GB202116617D0 (en)
Inventor
Liu Yu
Wu Yilin
Luo Jiaxiang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Guangzhou Institute of Modern Industrial Technology
Original Assignee
South China University of Technology SCUT
Guangzhou Institute of Modern Industrial Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT, Guangzhou Institute of Modern Industrial Technology filed Critical South China University of Technology SCUT
Publication of GB202116617D0 publication Critical patent/GB202116617D0/en
Publication of GB2597632A publication Critical patent/GB2597632A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0248Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/027Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising intertial navigation means, e.g. azimuth detector
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/579Depth or shape recovery from multiple images from motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

An RGBD sensor and IMU sensor-based positioning method, which is characterized in that: firstly, a mobile robot carries an RGBD sensor and an IMU sensor to collect data; then, a visual SLAM algorithm is used to initialize the data of the RGBD sensor so as to obtain its pose information, while the data of the IMU sensor is pre-integrated, and the mobile robot is initialized to acquire its initial state information; tracking of the pose information, velocity, and IMU sensor bias of the mobile robot is optimized on the basis of a reprojection error; meanwhile, a sliding window-based nonlinear optimization algorithm is used to optimize the pose information and the map in real time; and loop detection is added to correct the drift caused by the zero bias of the IMU sensor.

Description

LOCATING METHOD BASED ON RGB-D SENSOR AND IMU SENSOR
CROSS REFERENCE TO RELATED APPLICATION
[0001] The present application claims priority to Chinese Patent Application No. 202010166801.0, filed on March 11, 2020, entitled "LOCATING METHOD BASED ON RGB-D SENSOR AND IMU SENSOR", the entire content of which is incorporated herein by reference.
TECHNICAL FIELD
[0002] The present application relates to the technical field of the fusion of visual simultaneous localization and mapping (SLAM) and sensors, and in particular, to a locating method based on a red, green, blue, and depth (RGB-D) sensor and an inertial measurement unit (IMU) sensor.
BACKGROUND
[0003] SLAM is short for simultaneous localization and mapping, first proposed by Smith, Self, and Cheeseman in 1988, and is considered to be the key to realizing a truly fully autonomous mobile robot. The application of SLAM techniques can be described as follows: a robot starts moving from an unknown location in an unknown environment, locates itself based on location estimation and the data acquired by sensors during the movement, and builds incremental maps at the same time. SLAM that obtains environmental information only via a camera is called visual SLAM. Since the camera has the characteristics of low price, low power consumption, light weight, small size, rich image information and the like, visual SLAM has been studied by a large number of scholars and experts at home and abroad.
[0004] An IMU is an inertial measurement unit. An IMU sensor provides a three-axis gyroscope and a three-axis accelerometer to measure the angular velocity and acceleration of an object in three-dimensional space, and to calculate the attitude of the object. The IMU sensor is commonly used in devices that require motion control, such as vehicles and robots.
[0005] Although a visual SLAM algorithm can estimate its own pose in three-dimensional space from the data of a camera sensor, it easily loses the tracked feature points during fast motion or pure rotation. In addition, visual SLAM generally takes the first frame as the world coordinate system, so that the estimated pose is relative to the pose of the first image frame rather than to a world coordinate system such as the local East, North, Up (ENU) Cartesian coordinate system. The IMU sensor can provide a pose at each time instant. However, it suffers from a zero-drift problem during long-term motion, resulting in deviations in the estimated pose. A loop closure in the visual SLAM can solve this problem well. Therefore, the fusion of the visual SLAM and the IMU sensor can estimate the pose of the robot in an unknown space well.
SUMMARY
[0006] According to embodiments of the present application, a locating method based on a RGB-D sensor and an IMU sensor is provided.
[0007] The objective of the present application is achieved by the following technical solutions.
[0008] A locating method based on a RGB-D sensor and an IMU sensor is provided. The locating method is applicable to a mobile robot mounted with the RGB-D sensor and the IMU sensor, for locating the mobile robot. The locating method includes the following steps.
[0009] S1, acquiring data via the RGB-D sensor and the IMU sensor. The mobile robot mounted with the RGB-D sensor and the IMU sensor freely moves in a three-dimensional space, and the RGB-D sensor and the IMU sensor acquire data, respectively.
[0010] S2, performing an IMU pre-integration. A keyframe is selected by using a visual SLAM algorithm, according to the quality of the different frames of image data acquired by the RGB-D sensor. Based on the selected keyframes, the pre-integration of the IMU sensor between keyframes is calculated through an IMU pre-integration equation.
[0011] S3, initializing the mobile robot. A pose of the RGB-D sensor is obtained by initializing the RGB-D sensor using the visual SLAM algorithm, and then an initialized state of the mobile robot is obtained by initializing the mobile robot according to the pose of the RGB-D sensor and the IMU pre-integration of the corresponding keyframe. The state of the mobile robot includes a pose and a velocity of the mobile robot, and a bias of the IMU sensor.
[0012] S4, performing a tracking optimization on the state of the mobile robot. The tracking optimization is performed on the estimated state of the mobile robot by using an optimization algorithm based on the reprojection error, and then a nonlinear optimization is performed on the state of the mobile robot by using an optimization algorithm based on a sliding window.
[0013] S5, performing a loop detection. According to the data acquired by the RGB-D sensor, a loop optimization is performed on the pose of the RGB-D sensor and the feature points by using a loop optimization algorithm based on a bag-of-words model, and then the error term of the IMU sensor is optimized by using an optimization algorithm based on a sliding window.
[0014] Further, the acquiring frequency of the IMU sensor is greater than the acquiring frequency of the RGB-D sensor. The acquiring frequency of the IMU sensor is above 100 Hz, while the acquiring frequency of the RGB-D sensor is about 30 Hz. Since the acquiring frequency of the IMU sensor is much greater than that of the RGB-D sensor, the data are required to be aligned through the IMU pre-integration.
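As a rough illustration of that alignment, the sketch below groups the IMU samples that fall between two consecutive RGB-D frames; each group is what the pre-integration of step S2 would later operate on. This is a minimal sketch, assuming timestamps in seconds; the sampling rates, function and variable names are illustrative and are not taken from the application.

```python
import numpy as np

def bucket_imu_between_frames(imu_stamps, frame_stamps):
    """Group IMU sample indices by the camera-frame interval they fall into.

    imu_stamps   : (N,) array of IMU timestamps (e.g. >= 100 Hz)
    frame_stamps : (M,) array of RGB-D frame timestamps (e.g. ~30 Hz)
    Returns a list of M-1 index arrays; bucket k holds the IMU samples
    acquired between frame k and frame k+1.
    """
    buckets = []
    for t0, t1 in zip(frame_stamps[:-1], frame_stamps[1:]):
        idx = np.where((imu_stamps >= t0) & (imu_stamps < t1))[0]
        buckets.append(idx)
    return buckets

# Illustrative timestamps: 200 Hz IMU, 30 Hz camera over one second.
imu_t = np.arange(0.0, 1.0, 1.0 / 200.0)
cam_t = np.arange(0.0, 1.0, 1.0 / 30.0)
print([len(b) for b in bucket_imu_between_frames(imu_t, cam_t)])
```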
[0015] Further, the rule of using the visual SLAM algorithm to select the keyframe in the step S2 is as follows: the keyframe is selected according to the frame interval between a current image frame and the previous keyframe acquired by the RGB-D sensor, and the number of common visual feature points between the current image frame and the previous keyframe.
[0016] Further, the IMU pre-integration is calculated in the step S2. The acceleration and the angular velocity measured by the IMU sensor are:

$$\tilde{\omega}(t) = \omega(t) + b^g(t) + n^g(t)$$
$$\tilde{a}(t) = (R^{WB})^T \left( a(t) - g^W \right) + b^a(t) + n^a(t)$$

where $\omega(t)$ and $a(t)$ represent the true values of the angular velocity and the acceleration respectively, $\tilde{\omega}(t)$ and $\tilde{a}(t)$ represent the values of the angular velocity and the acceleration measured by the IMU sensor respectively, $R^{WB}$ represents the rotation matrix when the coordinate system of the IMU sensor is transformed to the world coordinate system, $b^g(t)$ and $b^a(t)$ represent the gyroscope bias and the accelerometer bias of the IMU sensor over time respectively, $n^g(t)$ and $n^a(t)$ represent the Gaussian white noise of the gyroscope and the accelerometer of the IMU sensor over time respectively, and $g^W$ represents the gravity vector in the world coordinate system.
[0017] The values measured by the IMU sensor are integrated and then discretized according to the acquiring frequency, to obtain:

$$R_j^{WB} = R_i^{WB} \prod_{k=i}^{j-1} \mathrm{Exp}\left( (\tilde{\omega}_k - b_k^g - n_k^{gd}) \Delta t \right)$$
$$v_j = v_i + g^W \Delta t_{ij} + \sum_{k=i}^{j-1} \Delta R_{ik} (\tilde{a}_k - b_k^a - n_k^{ad}) \Delta t$$
$$p_j = p_i + \sum_{k=i}^{j-1} \left[ v_k \Delta t + \tfrac{1}{2} g^W \Delta t^2 + \tfrac{1}{2} \Delta R_{ik} (\tilde{a}_k - b_k^a - n_k^{ad}) \Delta t^2 \right]$$

where $b_k^g$ and $b_k^a$ represent the gyroscope bias and the accelerometer bias of the IMU sensor at time k respectively, $n_k^{gd}$ and $n_k^{ad}$ represent the Gaussian white noise of the gyroscope and the accelerometer of the IMU sensor after being discretized respectively, $\Delta R_{ik}$ represents the rotation matrix from time i to time k, $\tilde{a}_k$ and $\tilde{\omega}_k$ represent the discrete values of the acceleration and the angular velocity measured by the IMU sensor respectively, $p_i$, $v_i$ and $R_i^{WB}$ represent the displacement, the velocity, and the rotation matrix of the IMU sensor after being discretized at the time i respectively, $g^W$ represents the gravity vector in the world coordinate system, and $\Delta t$ represents the discrete time difference.
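To make the discretization in [0017] concrete, the following sketch performs a single integration step with NumPy, taking the noise terms $n^{gd}$, $n^{ad}$ as zero. The helper names are illustrative and are not part of the application.

```python
import numpy as np

def skew(v):
    """Antisymmetric (hat) matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def so3_exp(phi):
    """SO(3) exponential map (Rodrigues formula) of a rotation vector."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3) + skew(phi)
    K = skew(phi / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def integrate_imu_step(R, v, p, gyro, acc, bg, ba, g_w, dt):
    """One discrete step of the model in [0017], noise terms set to zero:
    rotate by Exp((w - bg) dt), then update velocity and position."""
    a_world = R @ (acc - ba) + g_w      # bias-corrected acceleration in the world frame
    R_next = R @ so3_exp((gyro - bg) * dt)
    v_next = v + a_world * dt
    p_next = p + v * dt + 0.5 * a_world * dt * dt
    return R_next, v_next, p_next
```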
[0018] Further, in order to avoid re-deriving the integration at the next time whenever the data at the previous time changes, a pre-integration value between the two times is calculated, and the accelerometer bias and the gyroscope bias of the IMU sensor are separated out. The state at the next time is calculated from the state at the current time through the IMU pre-integration:

$$R_{i+1}^{WB} = R_i^{WB} \, \Delta R_{i,i+1} \, \mathrm{Exp}(J_{\Delta R}^g b_i^g)$$
$$v_{i+1} = v_i + g^W \Delta t_{i,i+1} + R_i^{WB} \left( \Delta v_{i,i+1} + J_{\Delta v}^g b_i^g + J_{\Delta v}^a b_i^a \right)$$
$$p_{i+1} = p_i + v_i \Delta t_{i,i+1} + \tfrac{1}{2} g^W \Delta t_{i,i+1}^2 + R_i^{WB} \left( \Delta p_{i,i+1} + J_{\Delta p}^g b_i^g + J_{\Delta p}^a b_i^a \right)$$

where $R_{i+1}^{WB}$ represents the rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the time i + 1, $v_{i+1}$ and $p_{i+1}$ represent the velocity and the pose of the coordinate system B of the IMU sensor at the time i + 1 in the world coordinate system respectively, $\Delta t_{i,i+1}$ represents the interval from the time i to the time i + 1 of the IMU sensor, $b_i^g$ and $b_i^a$ represent the gyroscope bias and the accelerometer bias of the IMU sensor at the time i respectively, $g^W$ represents the gravitational acceleration in the world coordinate system, $\Delta R_{i,i+1}$, $\Delta v_{i,i+1}$ and $\Delta p_{i,i+1}$ represent the pre-integrations of the rotation matrix, the velocity transformation, and the displacement transformation of the IMU sensor from the time i to the time i + 1 respectively, $J_{\Delta R}^g$, $J_{\Delta v}^g$ and $J_{\Delta p}^g$ represent the first-order approximation Jacobian matrices when $\Delta R$, $\Delta v$ and $\Delta p$ change with the gyroscope bias $b_i^g$ of the IMU sensor respectively, and $J_{\Delta v}^a$ and $J_{\Delta p}^a$ represent the first-order approximation Jacobian matrices when $\Delta v$ and $\Delta p$ change with the accelerometer bias $b_i^a$ of the IMU sensor respectively.
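The propagation in [0018] can be written compactly as below: the pre-integrated terms are corrected to first order for a bias update and applied to the state at keyframe i. This is a sketch under assumed conventions; the dictionary of Jacobians and all names are illustrative, not taken from the application.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def propagate_state(R_i, v_i, p_i, dR, dv, dp, jac, delta_bg, delta_ba, g_w, dt):
    """State propagation of [0018]: apply the pre-integrated terms (dR, dv, dp),
    corrected to first order for a bias update (delta_bg, delta_ba), to the
    state at keyframe i. `jac` holds the first-order Jacobians of the deltas
    with respect to the gyroscope and accelerometer biases."""
    dR_corr = dR @ Rotation.from_rotvec(jac["R_bg"] @ delta_bg).as_matrix()
    dv_corr = dv + jac["v_bg"] @ delta_bg + jac["v_ba"] @ delta_ba
    dp_corr = dp + jac["p_bg"] @ delta_bg + jac["p_ba"] @ delta_ba
    R_next = R_i @ dR_corr
    v_next = v_i + g_w * dt + R_i @ dv_corr
    p_next = p_i + v_i * dt + 0.5 * g_w * dt * dt + R_i @ dp_corr
    return R_next, v_next, p_next
```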
[0019] Further, it is necessary to initialize the pose of the RGB-D sensor, the biases and the velocity of the IMU sensor, and the gravity vector. The step S3 includes steps of:

[0020] S31, using the visual SLAM algorithm to initialize the RGB-D sensor, to obtain the pose of the RGB-D sensor, which includes steps of:

[0021] S311, using an ORB feature extraction algorithm to extract ORB feature points of the image data acquired by the RGB-D sensor, and finding out the three-dimensional space points corresponding to the ORB feature points according to the depth data of the RGB-D sensor, thereby obtaining the pose of the RGB-D sensor according to the pixel coordinates of the ORB feature points and the three-dimensional coordinates of the space points;

[0022] S312, for a space point $X_k = [U_k, V_k, W_k]^T$, its pixel coordinate in the coordinate system of the RGB-D sensor being $u_k = [x_k, y_k]^T$, and the correlation between the coordinates of the space points and the pixel coordinates being:

$$s u_k = K \exp(\xi^{\wedge}) X_k$$

where $s$ represents the scale factor of the depth map, $K$ represents the intrinsic parameter matrix of the RGB-D sensor, $\xi$ represents the estimated pose, $\xi^{\wedge}$ represents the antisymmetric matrix corresponding to the pose, and $k$ represents the serial number of the feature point; and

[0023] S313, optimizing the pose of the RGB-D sensor by constructing a least square relation, to minimize an error:

$$\xi^{*} = \arg\min_{\xi} \frac{1}{2} \sum_{k=1}^{n} \left\| u_k - \frac{1}{s} K \exp(\xi^{\wedge}) X_k \right\|_2^2$$

where $\xi^{*}$ represents the optimal pose after being optimized, $\xi$ represents the estimated pose, and $n$ represents the number of feature points;

[0024] S32, selecting N consecutive keyframes according to the visual SLAM algorithm, using the IMU pre-integration equation to integrate the gyroscope data of the IMU sensor between the N consecutive keyframes, to obtain the rotation transformation of the IMU sensor between the N keyframes, and estimating the gyroscope bias $b^g$ of the IMU sensor by establishing a minimum error equation of the rotation transformations of the RGB-D sensor and the IMU sensor between the N keyframes:

$$b^{g*} = \arg\min_{b^g} \sum_{i=1}^{N-1} \left\| \mathrm{Log}\left( \left( \Delta R_{i,i+1} \mathrm{Exp}(J_{\Delta R}^g b^g) \right)^T R_{i+1}^{BW} R_i^{WB} \right) \right\|^2$$

where N represents the number of keyframes, $R_i^{WB} = R_i^{WC} R^{CB}$, $R_i^{WC}$ represents the rotation of the keyframe i as the rotation of the coordinate system of the RGB-D sensor relative to the world coordinate system obtained by the visual SLAM algorithm, $R_{i+1}^{BW}$ represents the rotation matrix when the world coordinate system is transformed to the coordinate system of the IMU sensor at the keyframe i + 1, $R^{CB}$ represents the calibrated rotation matrix when the calibrated IMU sensor is transformed to the coordinate system of the RGB-D sensor, $\Delta R_{i,i+1}$ represents the rotation matrix obtained through the IMU pre-integration between two consecutive keyframes, and $J_{\Delta R}^g$ represents the first-order approximation Jacobian matrix when $\Delta R$ changes with the gyroscope bias of the IMU sensor; and

[0025] S33, estimating the accelerometer bias and the gravity vector of the IMU sensor by using the estimated result of the gyroscope bias $b^g$ of the IMU sensor, which includes the following steps.
[0026] S331, introducing a gravity constant G, and normalizing the gravity vector $\hat{g}^I = \{0, 0, -1\}$ in the coordinate system of the IMU sensor and the gravity vector $g^{W*}$ estimated by the IMU sensor through $\hat{g}^{W*} = \frac{g^{W*}}{\| g^{W*} \|}$, to calculate the rotation matrix $R^{WI}$:

$$R^{WI} = \mathrm{Exp}(\hat{v}\theta)$$

where $R^{WI}$ represents the direction of the IMU sensor in the world coordinate system,

$$\hat{v} = \frac{\hat{g}^I \times \hat{g}^{W*}}{\| \hat{g}^I \times \hat{g}^{W*} \|}, \qquad \theta = \arctan\!\left( \frac{\| \hat{g}^I \times \hat{g}^{W*} \|}{\hat{g}^I \cdot \hat{g}^{W*}} \right)$$

where $\hat{v}$ represents the rotation axis between the gravity direction of the IMU sensor and the gravity vector estimated by the IMU sensor, and $\theta$ represents the rotation angle thereof.

[0027] The gravity vector of the IMU sensor may be represented as: $g^W = R^{WI} \hat{g}^I G$.

[0028] A gravity disturbance vector $\delta\theta$ is added:

$$g^W = R^{WI} \mathrm{Exp}(\delta\theta) \hat{g}^I G, \qquad \delta\theta = [\delta\theta_{xy}^T, 0]^T, \quad \delta\theta_{xy} = [\delta\theta_x, \delta\theta_y]^T$$

and the first-order approximation to $g^W$ is:

$$g^W \approx R^{WI} \hat{g}^I G - R^{WI} (\hat{g}^I)^{\times} G \, \delta\theta$$

where $(\hat{g}^I)^{\times}$ represents the antisymmetric matrix of the gravity vector $\hat{g}^I$.

[0029] According to the transformation between the coordinate system C of the RGB-D sensor and the coordinate system B of the IMU sensor: $p_B^W = p_C^W + R^{WC} p_B^C$.

[0030] According to the IMU pre-integration calculating equation, the following equation may be established:

$$p_{i+1}^C = p_i^C + v_i \Delta t_{i,i+1} - \tfrac{1}{2} R^{WI} (\hat{g}^I)^{\times} G \Delta t_{i,i+1}^2 \, \delta\theta + \tfrac{1}{2} R^{WI} \hat{g}^I G \Delta t_{i,i+1}^2 + (R_i^{WC} - R_{i+1}^{WC}) p_B^C + R_i^{WB} \left( \Delta p_{i,i+1} + J_{\Delta p}^g b^g \right)$$

where $J_{\Delta p}^g$ represents the first-order approximation Jacobian matrix when $\Delta p$ changes with the gyroscope bias of the IMU sensor, $p_B^C$ represents the coordinates of the center of the coordinate system of the IMU sensor in the coordinate system of the RGB-D sensor, $p_i^C$ represents the coordinates of the center of the coordinate system of the RGB-D sensor in the world coordinate system at the keyframe i, $v_i$ represents the velocity of the IMU sensor at the keyframe i, $R_i^{WC}$ represents the rotation matrix of the coordinate system of the RGB-D sensor relative to the world coordinate system at the keyframe i, $R_i^{WB}$ represents the rotation matrix of the coordinate system of the IMU sensor relative to the world coordinate system at the keyframe i, and $\Delta t_{i,i+1}$ represents the time interval from the keyframe i to the keyframe i + 1 of the IMU sensor.
[0031] According to the recurrence relation of the IMU pre-integration equation between three consecutive keyframes, and the transformation equation between the coordinate system of the IMU sensor and the coordinate system of the RGB-D sensor, a linear equation may be obtained:

$$[\alpha(i) \;\; \beta(i)] \begin{bmatrix} \delta\theta_{xy} \\ b^a \end{bmatrix} = \gamma(i)$$

where

$$\alpha(i) = \left[ -\tfrac{1}{2} R^{WI} (\hat{g}^I)^{\times} G \left( \Delta t_{12}^2 \Delta t_{23} + \Delta t_{12} \Delta t_{23}^2 \right) \right]_{(:,1:2)}$$
$$\beta(i) = R_1^{WB} J_{\Delta v_{12}}^a \Delta t_{12} \Delta t_{23} + R_2^{WB} J_{\Delta p_{23}}^a \Delta t_{12} - R_1^{WB} J_{\Delta p_{12}}^a \Delta t_{23}$$
$$\gamma(i) = (p_3^C - p_2^C) \Delta t_{12} + (p_1^C - p_2^C) \Delta t_{23} - \tfrac{1}{2} R^{WI} \hat{g}^I G \left( \Delta t_{12}^2 \Delta t_{23} + \Delta t_{12} \Delta t_{23}^2 \right) + (R_1^{WC} - R_2^{WC}) p_B^C \Delta t_{23} - (R_2^{WC} - R_3^{WC}) p_B^C \Delta t_{12} + R_1^{WB} \Delta p_{12} \Delta t_{23} - R_1^{WB} \Delta v_{12} \Delta t_{12} \Delta t_{23} - R_2^{WB} \Delta p_{23} \Delta t_{12}$$

where $\delta\theta_{xy}$ represents the second-order vector in $\delta\theta = [\delta\theta_{xy}^T, 0]^T$, $\delta\theta$ represents the gravity deviation vector, $b^a$ represents the initialized accelerometer bias of the IMU sensor, $R^{WI}$ represents the direction of the IMU sensor in the world coordinate system, $\Delta t_{12}$ and $\Delta t_{23}$ represent the time intervals from the keyframes 1, 2 to the keyframes 2, 3 of the IMU sensor respectively, G represents the reference value of gravity, $\hat{g}^I$ represents the gravity vector in the coordinate system of the IMU sensor, $[\cdot]_{(:,1:2)}$ represents that only the data of the first two columns of the matrix is included, $R_1^{WB}$ and $R_2^{WB}$ represent the rotation matrices of the coordinate system of the IMU sensor relative to the world coordinate system at the keyframes 1 and 2 respectively, $\Delta p_{12}$ and $\Delta p_{23}$ represent the displacements of the IMU sensor from the keyframes 1, 2 to the keyframes 2, 3 respectively, $\Delta v_{12}$ represents the variation in the velocity from the keyframe 1 to the keyframe 2, $J_{\Delta v_{12}}^a$, $J_{\Delta p_{23}}^a$ and $J_{\Delta p_{12}}^a$ represent the first-order approximation Jacobian matrices when $\Delta v_{12}$, $\Delta p_{23}$ and $\Delta p_{12}$ change with the accelerometer bias of the IMU sensor respectively, $R_1^{WC}$, $R_2^{WC}$ and $R_3^{WC}$ represent the rotation matrices of the coordinate system C of the RGB-D sensor relative to the world coordinate system at the keyframes 1, 2, and 3 respectively, $p_1^C$, $p_2^C$ and $p_3^C$ represent the coordinates of the center of the coordinate system of the RGB-D sensor in the world coordinate system at the keyframes 1, 2, and 3 respectively, and $p_B^C$ represents the coordinates of the center of the coordinate system of the IMU sensor in the coordinate system of the RGB-D sensor.
[0032] S332, solving the gravity deviation second-order vector $\delta\theta_{xy}$ and the accelerometer bias $b^a$ through singular value decomposition.
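A minimal sketch of this solve, assuming each consecutive keyframe triple contributes one block of the linear system in [0031], is given below. NumPy's lstsq uses an SVD internally; all names and block shapes are illustrative.

```python
import numpy as np

def solve_gravity_and_accel_bias(alpha_blocks, beta_blocks, gamma_blocks):
    """Stack the per-triple terms alpha(i) x_g + beta(i) x_b = gamma(i) and solve
    for x = [dtheta_xy (2 values), b^a (3 values)] in the least-squares sense.

    alpha_blocks : list of (3, 2) matrices
    beta_blocks  : list of (3, 3) matrices
    gamma_blocks : list of (3,) vectors
    """
    A = np.vstack([np.hstack([a, b]) for a, b in zip(alpha_blocks, beta_blocks)])
    rhs = np.concatenate(gamma_blocks)
    x, *_ = np.linalg.lstsq(A, rhs, rcond=None)   # SVD-based least squares
    return x[:2], x[2:]   # gravity deviation dtheta_xy, accelerometer bias b^a
```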
[0033] S333, calculating the velocity $v_i$ of the IMU sensor as the initialized velocity of the IMU sensor through the equation:

$$v_i \Delta t_{i,i+1} = p_{i+1}^C - p_i^C + \tfrac{1}{2} R^{WI} (\hat{g}^I)^{\times} G \Delta t_{i,i+1}^2 \, \delta\theta - \tfrac{1}{2} R^{WI} \hat{g}^I G \Delta t_{i,i+1}^2 - (R_i^{WC} - R_{i+1}^{WC}) p_B^C - R_i^{WB} \left( \Delta p_{i,i+1} + J_{\Delta p}^g b^g \right)$$

where $J_{\Delta p}^g$ represents the first-order approximation Jacobian matrix when $\Delta p$ changes with the gyroscope bias of the IMU sensor, $p_B^C$ represents the coordinates of the center of the coordinate system of the IMU sensor in the coordinate system of the RGB-D sensor, $p_i^C$ represents the coordinates of the center of the coordinate system of the RGB-D sensor in the world coordinate system at the keyframe i, $v_i$ represents the velocity of the IMU sensor at the keyframe i, $R_i^{WC}$ represents the rotation matrix of the coordinate system of the RGB-D sensor relative to the world coordinate system at the keyframe i, $R_i^{WB}$ represents the rotation matrix of the coordinate system of the IMU sensor relative to the world coordinate system at the keyframe i, and $\Delta t_{i,i+1}$ represents the time interval from the keyframe i to the keyframe i + 1 of the IMU sensor.
[0034] Further, the state of the mobile robot includes the velocity and the pose of the mobile robot, and the biases of the IMU sensor. The step S4 includes steps of:

[0035] S41, estimating a pose of the RGB-D sensor by using the visual SLAM algorithm;

[0036] S42, establishing a constraint relation between keyframes, and establishing a least square relation based on a reprojection error, according to an error term of the IMU sensor and the pose of the mobile robot, wherein:

[0037] $E_{proj}(k, j)$ represents the reprojection error of the pose and the feature points, and is expressed as follows:

$$E_{proj}(k, j) = \left( s u_k - K \exp(\xi^{\wedge}) X_k^C \right)^T \Sigma_k \left( s u_k - K \exp(\xi^{\wedge}) X_k^C \right), \qquad X_k^C = R^{CB} R_j^{BW} (X_k^W - p_j^W) + p_B^C$$

[0038] $E_{IMU}(i, j)$ represents the error term of the IMU sensor, and is expressed as follows:

$$E_{IMU}(i, j) = [e_R^T \; e_v^T \; e_p^T] \, \Sigma_I \, [e_R^T \; e_v^T \; e_p^T]^T + e_b^T \Sigma_R e_b$$
$$e_R = \mathrm{Log}\left( \left( \Delta R_{ij} \, \mathrm{Exp}(J_{\Delta R}^g b_j^g) \right)^T R_i^{BW} R_j^{WB} \right)$$
$$e_v = R_i^{BW} \left( v_j^W - v_i^W - g^W \Delta t_{ij} \right) - \left( \Delta v_{ij} + J_{\Delta v}^g b_j^g + J_{\Delta v}^a b_j^a \right)$$
$$e_p = R_i^{BW} \left( p_j^W - p_i^W - v_i^W \Delta t_{ij} - \tfrac{1}{2} g^W \Delta t_{ij}^2 \right) - \left( \Delta p_{ij} + J_{\Delta p}^g b_j^g + J_{\Delta p}^a b_j^a \right)$$
$$e_b = b_j - b_i$$

where k represents the sequence number of a feature point of an image, i represents the sequence number of a keyframe, j represents the sequence number of the current keyframe, s represents the scale factor of the depth map, $u_k$ represents the two-dimensional coordinates of the k-th feature point in the camera coordinate system of the RGB-D sensor, K represents the intrinsic parameter matrix of the RGB-D sensor, $\xi^{\wedge}$ represents the antisymmetric matrix of the pose $\xi$ of the RGB-D sensor, $X_k^C$ represents the coordinates of the k-th feature point $X_k$ in the coordinate system C of the RGB-D sensor, $\Sigma_k$ represents the scale information matrix of the feature point $X_k$ and the corresponding pixel point, $R^{CB}$ represents the calibrated rotation matrix when the calibrated IMU sensor is transformed to the coordinate system of the RGB-D sensor, $R_j^{BW}$ represents the rotation matrix of the world coordinate system relative to the coordinate system of the IMU sensor at the keyframe j, $X_k^W$ represents the coordinates of the k-th feature point $X_k$ in the world coordinate system W, $p_j^W$ represents the coordinates of the IMU sensor in the world coordinate system at the keyframe j, $p_B^C$ represents the coordinates of the center of the coordinate system of the IMU sensor in the coordinate system of the RGB-D sensor, $\Delta R_{ij}$, $\Delta v_{ij}$ and $\Delta p_{ij}$ represent the pre-integrations of the rotation matrix, the velocity transformation, and the displacement transformation of the IMU sensor from the keyframe i to the keyframe j respectively, $J_{\Delta R}^g$, $J_{\Delta v}^g$ and $J_{\Delta p}^g$ represent the first-order approximation Jacobian matrices when $\Delta R$, $\Delta v$ and $\Delta p$ change with the gyroscope bias of the IMU sensor respectively, $J_{\Delta v}^a$ and $J_{\Delta p}^a$ represent the first-order approximation Jacobian matrices when $\Delta v$ and $\Delta p$ change with the accelerometer bias of the IMU sensor respectively, $v_j^W$ represents the velocity of the IMU sensor at the keyframe j, $R_j^{WB}$ represents the rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the keyframe j, $b_j^g$ and $b_j^a$ represent the gyroscope bias and the accelerometer bias of the IMU sensor at the keyframe j respectively, $\Sigma_R$ and $\Sigma_I$ represent the information matrices of the biases of the IMU sensor when randomly walking and being pre-integrated respectively, $b_i$ and $b_j$ represent the biases of the IMU sensor at the keyframes i and j respectively, and $\Delta t_{ij}$ represents the time interval from the keyframe i to the keyframe j;

[0039] S43, only establishing a constraint relation between the current keyframe j and the adjacent keyframe j - 1 for the state tracking of the mobile robot at the current keyframe j:

$$\theta^{*} = \arg\min_{\theta} \left( \sum_k E_{proj}(k, j) + E_{IMU}(j-1, j) \right), \qquad \theta = \{ R_j^{WB}, p_j^W, v_j^W, b_j^g, b_j^a \}$$

where $\theta$ represents the set of objects required to be optimized, $\theta^{*}$ represents the set of the optimized objects $\theta$, k represents the sequence number of the feature point, j represents the sequence number of the current keyframe, $R_j^{WB}$ represents the rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the keyframe j, $p_j^W$ represents the translation of the IMU sensor in the world coordinate system at the keyframe j, $v_j^W$ represents the velocity of the IMU sensor at the keyframe j, and $b_j^g$ and $b_j^a$ represent the gyroscope bias and the accelerometer bias of the IMU sensor at the keyframe j respectively; and the pose of the mobile robot, and the optimal values of the velocity and the biases of the IMU sensor at the current keyframe are solved, as the state tracking of the mobile robot; and

[0040] S44, since the RGB-D sensor and the IMU sensor require the optimization of a state with 9 degrees of freedom, if all the state variables are optimized for all common-view keyframes, the amount of calculation is too large to realize real-time locating. In addition, the IMU sensor has a zero bias, so the longer the time from the data at the current time is, the lower the reliability is.
[0041] Therefore, an optimization algorithm based on a sliding window is used to non-linearly optimize the state of the mobile robot. For the M keyframes before the current keyframe, a constraint relation between the pose of the mobile robot and the error term of the IMU sensor is established to perform the optimization:

$$\theta^{*} = \arg\min_{\theta} \left( \sum_k E_{proj}(k, j) + E_{IMU}(i, j) \right), \qquad \theta = \{ R_j^{WB}, p_j^W, v_j^W, b_j^g, b_j^a \}$$

where $\theta$ represents the set of objects required to be optimized, $\theta^{*}$ represents the set of the optimized objects $\theta$, k represents the sequence number of the feature point, i represents the sequence number of the keyframe, j represents the sequence number of the current keyframe, $R_j^{WB}$ represents the rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the keyframe j, $p_j^W$ represents the translation of the IMU sensor in the world coordinate system at the keyframe j, $v_j^W$ represents the velocity of the IMU sensor at the keyframe j, and $b_j^g$ and $b_j^a$ represent the gyroscope bias and the accelerometer bias of the IMU sensor at the keyframe j respectively.
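The window structure described in [0041] and in the next paragraph can be sketched as a simple partition of the keyframes: the latest M keep the full 9-DoF state in the optimization, while older covisible keyframes only contribute reprojection constraints. The dictionary layout and names below are illustrative assumptions, not part of the application.

```python
def split_window(keyframes, current_points, m):
    """Partition keyframes: the latest m keyframes keep the full state
    (pose, velocity, IMU biases) in the optimization, while older keyframes
    enter with pose-only reprojection constraints if they share feature
    points with the current keyframe.

    keyframes      : list of dicts with a 'points' set of feature-point ids
    current_points : set of feature-point ids observed in the current keyframe
    """
    full_state = keyframes[-m:]
    pose_only = [kf for kf in keyframes[:-m] if kf['points'] & current_points]
    return full_state, pose_only

# Illustrative usage with three keyframes and a window of size m = 2.
kfs = [{'id': 0, 'points': {1, 2, 3}},
       {'id': 1, 'points': {3, 4}},
       {'id': 2, 'points': {4, 5}}]
full, old = split_window(kfs, current_points={3, 5}, m=2)
print([kf['id'] for kf in full], [kf['id'] for kf in old])
```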
[0042] For a keyframe before the current keyframe M + 1, only a constraint relation between the pose and the feature points of the keyframes that have a common viewpoint with the current keyframe is established to perform the optimization:

$$\theta^{*} = \arg\min_{\theta} \sum_k E_{proj}(k, j), \qquad \theta = \{ R_j^{WB}, p_j^W \}$$

where $\theta$ represents the set of objects required to be optimized, and $\theta^{*}$ represents the set of the optimized objects $\theta$.

[0043] Further, the step S5 includes steps of:

[0044] S51, using a loop detection algorithm based on a bag-of-words model to determine whether there is a loop in the image acquired by the RGB-D sensor at the current keyframe;

[0045] S52, if there is no loop, outputting the pose of the mobile robot at the current keyframe, ending the loop optimization, and then tracking the subsequent state of the mobile robot; and, if the loop is detected, performing the loop optimization on the pose of the RGB-D sensor and the feature points based on the visual SLAM algorithm; and

[0046] S53, optimizing the pose of the mobile robot, the feature points, and the error term of the IMU sensor by using the non-linear optimization algorithm based on the sliding window, and outputting the pose of the mobile robot at the current keyframe.
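The bag-of-words check in step S51 can be illustrated with a plain cosine similarity between visual-word histograms, as below. The similarity threshold, minimum temporal gap, and all names are illustrative assumptions; the application does not prescribe specific values or a specific vocabulary library.

```python
import numpy as np

def bow_similarity(hist_a, hist_b):
    """Cosine similarity between two bag-of-visual-words histograms."""
    denom = np.linalg.norm(hist_a) * np.linalg.norm(hist_b)
    return float(hist_a @ hist_b) / denom if denom > 0.0 else 0.0

def detect_loop(current_hist, past_hists, min_gap=30, threshold=0.75):
    """Return the index of the best loop candidate among past keyframes,
    or None (step S51). Keyframes closer than min_gap to the current one
    are skipped to avoid trivial matches."""
    candidates = past_hists[:-min_gap] if len(past_hists) > min_gap else []
    best_idx, best_score = None, threshold
    for idx, hist in enumerate(candidates):
        score = bow_similarity(current_hist, hist)
        if score > best_score:
            best_idx, best_score = idx, score
    return best_idx
```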
[0047] Compared with the prior art, the present application has advantages and effects as follows.
[0048] According to the locating method based on the RGB-D sensor and the IMU sensor disclosed by the present application, firstly, the RGB-D sensor and the IMU sensor are used to obtain data related to the mobile robot; then the RGB-D sensor is initialized, the IMU sensor performs the pre-integration, and the mobile robot is initialized. After the initialization, the pose of the mobile robot is tracked and optimized at the back end. When a loop is generated, the loop processing is performed to locate the mobile robot. The locating method combines the RGB-D sensor and the IMU sensor, and overcomes the shortcomings that a pure visual SLAM algorithm is prone to tracking failure during rotation or in scenes with a monotonous background, and that the IMU sensor generates a zero bias during long-term motion.
BRIEF DESCRIPTION OF THE DRAWINGS
[0049] FIG. 1 is a flowchart illustrating a locating method based on a RGB-D sensor and an IMU sensor according to the present application.
[0050] FIG. 2 is a schematic diagram illustrating a sliding window algorithm used by the present application.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0051] In order to make objectives, technical solutions, and advantages of embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be described clearly and completely in conjunction with the accompanying drawings in the embodiments of the present application. Apparently, the described embodiments are a part of the embodiments, not all the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of the present application.
[0052] Embodiment. This embodiment discloses a specific implementation process of a locating method based on a RGB-D sensor and an IMU sensor, which includes steps S1 to S5 as follows.
[0053] At S1, the RGB-D sensor is a Kinect v2 sensor. The Kinect v2 sensor and the IMU sensor are fastened onto a mobile robot, i.e., a turtlebot2 mobile robot. The Kinect v2 sensor, the IMU sensor, and the turtlebot2 mobile robot are connected to a mobile laptop via a universal serial bus (USB) interface, to control the turtlebot2 mobile robot to move in three-dimensional space. The Kinect v2 sensor and the IMU sensor acquire data, respectively. The acquiring frequency of the IMU sensor is greater than that of the Kinect v2 sensor.
[0054] At S2, a keyframe is selected by the ORB-SLAM2 algorithm, according to the frame interval between the current image frame and the previous keyframe acquired by the RGB-D sensor, and the number of common visual feature points between the current image frame and the previous keyframe.
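One possible form of that keyframe rule is sketched below; the numeric thresholds are illustrative assumptions, since neither the application nor ORB-SLAM2's exact heuristics are reproduced here.

```python
def should_insert_keyframe(frames_since_last, common_points, tracked_points,
                           max_gap=20, min_ratio=0.6):
    """Insert a new keyframe when too many frames have passed since the previous
    keyframe, or when the current frame shares too few visual feature points
    with it (both criteria follow the selection rule described in [0054])."""
    if frames_since_last >= max_gap:
        return True
    if tracked_points > 0 and common_points / tracked_points < min_ratio:
        return True
    return False
```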
[0055] The acceleration and the angular velocity may be measured by the IMU sensor, and the state at the next time may be calculated from the state at the current time through the IMU pre-integration. The formula of the calculation is as follows:

$$R_{i+1}^{WB} = R_i^{WB} \, \Delta R_{i,i+1} \, \mathrm{Exp}(J_{\Delta R}^g b_i^g)$$
$$v_{i+1} = v_i + g^W \Delta t_{i,i+1} + R_i^{WB} \left( \Delta v_{i,i+1} + J_{\Delta v}^g b_i^g + J_{\Delta v}^a b_i^a \right)$$
$$p_{i+1} = p_i + v_i \Delta t_{i,i+1} + \tfrac{1}{2} g^W \Delta t_{i,i+1}^2 + R_i^{WB} \left( \Delta p_{i,i+1} + J_{\Delta p}^g b_i^g + J_{\Delta p}^a b_i^a \right)$$

where $R_{i+1}^{WB}$ represents a rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the time i + 1, $v_{i+1}$ and $p_{i+1}$ represent the velocity and the pose of the coordinate system B of the IMU sensor at the time i + 1 in the world coordinate system respectively, $\Delta t_{i,i+1}$ represents the interval from the time i to the time i + 1 of the IMU sensor, $b_i^g$ and $b_i^a$ represent the gyroscope bias and the accelerometer bias of the IMU sensor at the time i respectively, $g^W$ represents the gravitational acceleration in the world coordinate system, $\Delta R_{i,i+1}$, $\Delta v_{i,i+1}$ and $\Delta p_{i,i+1}$ represent the pre-integrations of the rotation matrix, the velocity transformation, and the displacement transformation of the IMU sensor from the time i to the time i + 1 respectively, $J_{\Delta R}^g$, $J_{\Delta v}^g$ and $J_{\Delta p}^g$ represent the first-order approximation Jacobian matrices when $\Delta R$, $\Delta v$ and $\Delta p$ change with the gyroscope bias of the IMU sensor respectively, and $J_{\Delta v}^a$ and $J_{\Delta p}^a$ represent the first-order approximation Jacobian matrices when $\Delta v$ and $\Delta p$ change with the accelerometer bias $b_i^a$ of the IMU sensor respectively.

[0056] At S3, the pose of the RGB-D sensor, the biases and the velocity of the IMU sensor, and the gravity vector are initialized. The step S3 includes steps S31 to S33.

[0057] At S31, the visual ORB-SLAM2 algorithm is used to initialize the RGB-D sensor, to obtain the pose of the RGB-D sensor. The step S31 includes steps S311 to S313.

[0058] At S311, an ORB feature extraction algorithm is used to extract ORB feature points of the image data of the RGB-D sensor, and to find out the three-dimensional space points corresponding to the ORB feature points according to the depth data of the RGB-D sensor, thereby obtaining the pose of the RGB-D sensor according to the pixel coordinates of the ORB feature points and the three-dimensional coordinates of the space points.
[0059] At S312, for a space point $X_k = [U_k, V_k, W_k]^T$, its pixel coordinate in the coordinate system of the RGB-D sensor is $u_k = [x_k, y_k]^T$, and the correlation between the coordinates of the space points and the pixel coordinates is:

$$s u_k = K \exp(\xi^{\wedge}) X_k$$

where s represents the scale factor of the depth map, K represents the intrinsic parameter matrix of the RGB-D sensor, $\xi$ represents the estimated pose, $\xi^{\wedge}$ represents the antisymmetric matrix corresponding to the pose, and k represents the serial number of the feature point.
[0060] At S313, the pose of the RGB-D sensor is optimized by constructing a least square relation, to minimize the error:

$$\xi^{*} = \arg\min_{\xi} \frac{1}{2} \sum_{k=1}^{n} \left\| u_k - \frac{1}{s} K \exp(\xi^{\wedge}) X_k \right\|_2^2$$

where $\xi^{*}$ represents the optimal pose after being optimized, $\xi$ represents the estimated pose, and n represents the number of feature points.
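The residual inside this least-squares problem can be computed as below, taking the scale factor s as each point's depth in the camera frame. The pose is passed as an explicit rotation and translation rather than as the Lie-algebra vector $\xi$; names and layout are illustrative assumptions.

```python
import numpy as np

def reprojection_residuals(K, R, t, points_w, pixels):
    """Residuals u_k - (1/s) K (R X_k + t) for the problem in [0060].

    K        : (3, 3) intrinsic parameter matrix of the RGB-D sensor
    R, t     : pose estimate mapping world coordinates into the camera frame
    points_w : (N, 3) space points X_k
    pixels   : (N, 2) measured pixel coordinates u_k
    """
    cam = points_w @ R.T + t            # X_k expressed in the camera frame
    proj = cam @ K.T
    proj = proj[:, :2] / proj[:, 2:3]   # divide by the scale factor s (point depth)
    return pixels - proj
```

These residuals could, for instance, be handed to a generic solver such as scipy.optimize.least_squares over a 6-parameter pose vector, which is one possible way to realize the minimization of $\xi^{*}$.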
[0061] At S32, N consecutive keyframes are selected according to the visual SLAM algorithm. The IMU pre-integration equation is used to integrate the gyroscope data of the IMU sensor between the N consecutive keyframes, to obtain the rotation transformation of the IMU sensor between the N keyframes. The gyroscope bias $b^g$ of the IMU sensor is estimated by establishing a minimum error equation of the rotation transformations of the RGB-D sensor and the IMU sensor between the N keyframes:

$$b^{g*} = \arg\min_{b^g} \sum_{i=1}^{N-1} \left\| \mathrm{Log}\left( \left( \Delta R_{i,i+1} \mathrm{Exp}(J_{\Delta R}^g b^g) \right)^T R_{i+1}^{BW} R_i^{WB} \right) \right\|^2$$

where N represents the number of keyframes, $\Delta R_{i,i+1}$ represents the rotation matrix obtained through the IMU pre-integration between the consecutive keyframes i and i + 1, $J_{\Delta R}^g$ represents the first-order approximation Jacobian matrix when $\Delta R$ changes with the gyroscope bias of the IMU sensor, $R_i^{WB} = R_i^{WC} R^{CB}$, $R_i^{WC}$ represents the rotation of the keyframe i as the rotation of the coordinate system of the RGB-D sensor relative to the world coordinate system obtained by the visual ORB-SLAM2 algorithm, $R_{i+1}^{BW}$ represents the rotation matrix when the world coordinate system is transformed to the coordinate system of the IMU sensor at the keyframe i + 1, and $R^{CB}$ represents the calibrated rotation matrix when the calibrated IMU sensor is transformed to the coordinate system of the RGB-D sensor.
[0062] In this embodiment, two keyframes are used to estimate the gyroscope bias of the IMU sensor, and thus N is 2.
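A small numerical sketch of this bias estimation is given below, using SciPy for the rotation maps and the nonlinear least squares. The index convention assumed here is Forster-style pre-integration, where the pre-integrated rotation approximates the relative rotation from body frame i to body frame i + 1; all names are illustrative and the exact convention of the application may differ.

```python
import numpy as np
from scipy.spatial.transform import Rotation
from scipy.optimize import least_squares

def estimate_gyro_bias(dR_preint, J_dR_bg, R_wb_vision):
    """Step S32 sketch: find the gyroscope bias b^g that best aligns the
    pre-integrated rotations with the relative rotations of the visual poses.

    dR_preint   : list of (3, 3) pre-integrated rotations between keyframes
    J_dR_bg     : list of (3, 3) Jacobians of each dR with respect to b^g
    R_wb_vision : list of (3, 3) world-from-body rotations from the visual poses
    """
    def residuals(bg):
        res = []
        for dR, J, Ri, Rj in zip(dR_preint, J_dR_bg,
                                 R_wb_vision[:-1], R_wb_vision[1:]):
            corrected = dR @ Rotation.from_rotvec(J @ bg).as_matrix()
            err = corrected.T @ Ri.T @ Rj        # ~ identity for the right bias
            res.append(Rotation.from_matrix(err).as_rotvec())
        return np.concatenate(res)

    return least_squares(residuals, np.zeros(3)).x
```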
[0063] At S33, the accelerometer bias and the gravity vector of the IMU sensor are estimated by using the estimated result of the gyroscope bias $b^g$ of the IMU sensor. The step S33 includes steps S331 to S333.
[0064] At S331, according to the recurrence relation of the IMU pre-integration equation between three consecutive keyframes, and the transformation equation between the coordinate system of the IMU sensor and the coordinate system of the RGB-D sensor, a linear equation can be obtained:

$$[\alpha(i) \;\; \beta(i)] \begin{bmatrix} \delta\theta_{xy} \\ b^a \end{bmatrix} = \gamma(i)$$

where

$$\alpha(i) = \left[ -\tfrac{1}{2} R^{WI} (\hat{g}^I)^{\times} G \left( \Delta t_{12}^2 \Delta t_{23} + \Delta t_{12} \Delta t_{23}^2 \right) \right]_{(:,1:2)}$$
$$\beta(i) = R_1^{WB} J_{\Delta v_{12}}^a \Delta t_{12} \Delta t_{23} + R_2^{WB} J_{\Delta p_{23}}^a \Delta t_{12} - R_1^{WB} J_{\Delta p_{12}}^a \Delta t_{23}$$
$$\gamma(i) = (p_3^C - p_2^C) \Delta t_{12} + (p_1^C - p_2^C) \Delta t_{23} - \tfrac{1}{2} R^{WI} \hat{g}^I G \left( \Delta t_{12}^2 \Delta t_{23} + \Delta t_{12} \Delta t_{23}^2 \right) + (R_1^{WC} - R_2^{WC}) p_B^C \Delta t_{23} - (R_2^{WC} - R_3^{WC}) p_B^C \Delta t_{12} + R_1^{WB} \Delta p_{12} \Delta t_{23} - R_1^{WB} \Delta v_{12} \Delta t_{12} \Delta t_{23} - R_2^{WB} \Delta p_{23} \Delta t_{12}$$

where $\delta\theta_{xy}$ represents the second-order vector in $\delta\theta = [\delta\theta_{xy}^T, 0]^T$, $\delta\theta$ represents the gravity deviation vector, $b^a$ represents the initialized accelerometer bias of the IMU sensor, $R^{WI}$ represents the direction of the IMU sensor in the world coordinate system, $\Delta t_{12}$ and $\Delta t_{23}$ represent the time intervals from the keyframes 1, 2 to the keyframes 2, 3 of the IMU sensor respectively, G represents the reference value of gravity, $\hat{g}^I$ represents the gravity vector in the coordinate system of the IMU sensor, $(\hat{g}^I)^{\times}$ represents the antisymmetric matrix of the gravity vector $\hat{g}^I$, $[\cdot]_{(:,1:2)}$ represents that only the data of the first two columns of the matrix is included, $R_1^{WB}$ and $R_2^{WB}$ represent the rotation matrices of the coordinate system of the IMU sensor relative to the world coordinate system at the keyframes 1 and 2 respectively, $\Delta p_{12}$ and $\Delta p_{23}$ represent the displacements of the IMU sensor from the keyframes 1, 2 to the keyframes 2, 3 respectively, $\Delta v_{12}$ represents the variation in the velocity from the keyframe 1 to the keyframe 2, $J_{\Delta v_{12}}^a$, $J_{\Delta p_{23}}^a$ and $J_{\Delta p_{12}}^a$ represent the first-order approximation Jacobian matrices when $\Delta v_{12}$, $\Delta p_{23}$ and $\Delta p_{12}$ change with the accelerometer bias of the IMU sensor respectively, $R_1^{WC}$, $R_2^{WC}$ and $R_3^{WC}$ represent the rotation matrices of the coordinate system C of the RGB-D sensor relative to the world coordinate system at the keyframes 1, 2, and 3 respectively, $p_1^C$, $p_2^C$ and $p_3^C$ represent the coordinates of the center of the coordinate system of the RGB-D sensor in the world coordinate system at the keyframes 1, 2, and 3 respectively, and $p_B^C$ represents the coordinates of the center of the coordinate system of the IMU sensor in the coordinate system of the RGB-D sensor.
[0065] At S332, the gravity deviation second-order vector $\delta\theta_{xy}$ and the accelerometer bias $b^a$ are solved through singular value decomposition.
[0066] At S333, the velocity $v_i$ of the IMU sensor is calculated as the initialized velocity of the IMU sensor through the equation:

$$v_i \Delta t_{i,i+1} = p_{i+1}^C - p_i^C + \tfrac{1}{2} R^{WI} (\hat{g}^I)^{\times} G \Delta t_{i,i+1}^2 \, \delta\theta - \tfrac{1}{2} R^{WI} \hat{g}^I G \Delta t_{i,i+1}^2 - (R_i^{WC} - R_{i+1}^{WC}) p_B^C - R_i^{WB} \left( \Delta p_{i,i+1} + J_{\Delta p}^g b^g \right)$$

[0067] where $J_{\Delta p}^g$ represents the first-order approximation Jacobian matrix when $\Delta p$ changes with the gyroscope bias $b^g$ of the IMU sensor, $p_B^C$ represents the coordinates of the center of the coordinate system of the IMU sensor in the coordinate system of the RGB-D sensor, $p_i^C$ represents the coordinates of the center of the coordinate system of the RGB-D sensor in the world coordinate system at the keyframe i, $v_i$ represents the velocity of the IMU sensor at the keyframe i, $R_i^{WC}$ represents the rotation matrix of the coordinate system of the RGB-D sensor relative to the world coordinate system at the keyframe i, and $R_i^{WB}$ represents the rotation matrix of the coordinate system of the IMU sensor relative to the world coordinate system at the keyframe i.
[0068] At S4, the state of the mobile robot is tracked. The state of the mobile robot is optimized by using a non-linear optimization based on a sliding window. The state of the mobile robot includes the pose and the velocity of the mobile robot, and the biases of the IMU sensor. The step S4 includes steps S41 to S44.
[0069] At S41, the pose of the RGB-D sensor is estimated by using the visual ORB-SLAM2 algorithm.
[0070] At S42, a constraint relation between the keyframes is established. A least square relation is established based on the reprojection error, according to the error term of the IMU sensor and the pose of the mobile robot.
[0071] $E_{proj}(k, j)$ represents the reprojection error of the pose and the feature points, and is expressed as follows:

$$E_{proj}(k, j) = \left( s u_k - K \exp(\xi^{\wedge}) X_k^C \right)^T \Sigma_k \left( s u_k - K \exp(\xi^{\wedge}) X_k^C \right), \qquad X_k^C = R^{CB} R_j^{BW} (X_k^W - p_j^W) + p_B^C$$

[0072] $E_{IMU}(i, j)$ represents the error term of the IMU sensor, and is expressed as follows:

$$E_{IMU}(i, j) = [e_R^T \; e_v^T \; e_p^T] \, \Sigma_I \, [e_R^T \; e_v^T \; e_p^T]^T + e_b^T \Sigma_R e_b$$
$$e_R = \mathrm{Log}\left( \left( \Delta R_{ij} \, \mathrm{Exp}(J_{\Delta R}^g b_j^g) \right)^T R_i^{BW} R_j^{WB} \right)$$
$$e_v = R_i^{BW} \left( v_j^W - v_i^W - g^W \Delta t_{ij} \right) - \left( \Delta v_{ij} + J_{\Delta v}^g b_j^g + J_{\Delta v}^a b_j^a \right)$$
$$e_p = R_i^{BW} \left( p_j^W - p_i^W - v_i^W \Delta t_{ij} - \tfrac{1}{2} g^W \Delta t_{ij}^2 \right) - \left( \Delta p_{ij} + J_{\Delta p}^g b_j^g + J_{\Delta p}^a b_j^a \right)$$
$$e_b = b_j - b_i$$

where k represents the sequence number of a feature point of an image, i represents the sequence number of a keyframe, j represents the sequence number of the current keyframe, s represents the scale factor of the depth map, $u_k$ represents the two-dimensional coordinates of the k-th feature point in the camera coordinate system of the RGB-D sensor, K represents the intrinsic parameter matrix of the RGB-D sensor, $\xi^{\wedge}$ represents the antisymmetric matrix of the pose $\xi$ of the RGB-D sensor, $X_k^C$ represents the coordinates of the k-th feature point $X_k$ in the coordinate system C of the RGB-D sensor, $\Sigma_k$ represents the scale information matrix of the feature point $X_k$ and the corresponding pixel point, $R^{CB}$ represents the calibrated rotation matrix when the calibrated IMU sensor is transformed to the coordinate system of the RGB-D sensor, $R_j^{BW}$ represents the rotation matrix of the world coordinate system relative to the coordinate system of the IMU sensor at the keyframe j, $X_k^W$ represents the coordinates of the k-th feature point $X_k$ in the world coordinate system W, $p_j^W$ represents the coordinates of the IMU sensor in the world coordinate system at the keyframe j, $p_B^C$ represents the coordinates of the center of the coordinate system of the IMU sensor in the coordinate system of the RGB-D sensor, $\Delta R_{ij}$, $\Delta v_{ij}$ and $\Delta p_{ij}$ represent the pre-integrations of the rotation matrix, the velocity transformation, and the displacement transformation of the IMU sensor from the keyframe i to the keyframe j respectively, $J_{\Delta R}^g$, $J_{\Delta v}^g$ and $J_{\Delta p}^g$ represent the first-order approximation Jacobian matrices when $\Delta R$, $\Delta v$ and $\Delta p$ change with the gyroscope bias of the IMU sensor respectively, $J_{\Delta v}^a$ and $J_{\Delta p}^a$ represent the first-order approximation Jacobian matrices when $\Delta v$ and $\Delta p$ change with the accelerometer bias of the IMU sensor respectively, $v_j^W$ represents the velocity of the IMU sensor at the keyframe j, $R_j^{WB}$ represents the rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the keyframe j, $b_j^g$ and $b_j^a$ represent the gyroscope bias and the accelerometer bias of the IMU sensor at the keyframe j respectively, $\Sigma_R$ and $\Sigma_I$ represent the information matrices of the biases of the IMU sensor when randomly walking and being pre-integrated respectively, $b_i$ and $b_j$ represent the biases of the IMU sensor at the keyframes i and j respectively, and $\Delta t_{ij}$ represents the time difference from the keyframe i to the keyframe j.
[0073] At S43, for the state tracking of the mobile robot at the current keyframe j, only a constraint relation between the current keyframe j and the adjacent keyframe j - 1 is established:
$$\theta^{*} = \arg\min_{\theta} \left( \sum_k E_{proj}(k, j) + E_{IMU}(j-1, j) \right), \qquad \theta = \{ R_j^{WB}, p_j^W, v_j^W, b_j^g, b_j^a \}$$

where $\theta$ represents the set of objects required to be optimized, $\theta^{*}$ represents the set of the optimized objects $\theta$, k represents the sequence number of the feature point, j represents the sequence number of the current keyframe, $R_j^{WB}$ represents the rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the keyframe j, $p_j^W$ represents the translation of the IMU sensor in the world coordinate system at the keyframe j, $v_j^W$ represents the velocity of the IMU sensor at the keyframe j, and $b_j^g$ and $b_j^a$ represent the gyroscope bias and the accelerometer bias of the IMU sensor at the keyframe j respectively.
[0074] The pose of the mobile robot, and the optimal values of the velocity and the biases of the IMU sensor at the current keyframe are solved, as the state tracking of the mobile robot.
[0075] At S44, the state of the mobile robot is non-linearly optimized by using an optimization based on a sliding window. For the M keyframes before the current keyframe, a constraint relation between the pose of the mobile robot and the error term of the IMU sensor is established to perform the optimization.
$$\theta^{*} = \arg\min_{\theta} \left( \sum_k E_{proj}(k, j) + E_{IMU}(i, j) \right), \qquad \theta = \{ R_j^{WB}, p_j^W, v_j^W, b_j^g, b_j^a \}$$

where $\theta$ represents the set of objects required to be optimized, $\theta^{*}$ represents the set of the optimized objects $\theta$, k represents the sequence number of the feature point, i represents the sequence number of the keyframe, j represents the sequence number of the current keyframe, $R_j^{WB}$ represents the rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the keyframe j, $p_j^W$ represents the translation of the IMU sensor in the world coordinate system at the keyframe j, $v_j^W$ represents the velocity of the IMU sensor at the keyframe j, and $b_j^g$ and $b_j^a$ represent the gyroscope bias and the accelerometer bias of the IMU sensor at the keyframe j respectively.
[0076] For a keyframe before the current keyframe M + 1, only a constraint relation between the pose and the feature points of the keyframes that have a common viewpoint with the current keyframe is established to perform the optimization:

$$\theta^{*} = \arg\min_{\theta} \sum_k E_{proj}(k, j), \qquad \theta = \{ R_j^{WB}, p_j^W \}$$

where $\theta$ represents the set of objects required to be optimized, and $\theta^{*}$ represents the set of the optimized objects $\theta$.
[0077] In this embodiment, the magnitude of M varies with the time interval between the current keyframe and the previous keyframes, and M is set according to the number of keyframes within 5 s before the current keyframe.
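A minimal sketch of that rule, assuming keyframe timestamps in seconds and illustrative names, is:

```python
def window_size(keyframe_stamps, current_stamp, horizon=5.0):
    """Pick M as described in [0077]: count the keyframes whose timestamps fall
    within `horizon` seconds before the current keyframe."""
    return sum(1 for t in keyframe_stamps if 0.0 <= current_stamp - t <= horizon)
```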
[0078] At S5, a loop optimization is performed. The step S5 includes steps S51 to S53.
[0079] At S51, a loop detection algorithm based on the bag-of-words model is used to determine whether there is a loop in an image acquired by the RGB-D sensor at the current keyframe.
[0080] At S52, if there is no loop, the pose of the mobile robot at the current keyframe is output, the loop optimization ends, and then the subsequent state of the mobile robot is tracked. If the loop is detected, the loop optimization is performed on the pose of the RGB-D sensor and the feature points based on the visual ORB-SLAM2 algorithm.
[0081] At S53, the pose of the mobile robot, the feature points, and the error term of the IMU sensor are optimized by using the non-linear optimization algorithm based on the sliding window. The pose of the mobile robot at the current keyframe is output.
[0082] The above embodiments are exemplary embodiments of the present application, but the embodiments of the present application are not limited by the above embodiments. Any other changes, modifications, substitutions, combinations, simplifications, etc. made without departing from the spirit and principle of the present application should be considered as equivalent replacements, which are all included within the protection scope of the present application.

Claims (7)

  1. What is claimed is: 1. A locating method based on a RGB-D sensor and an IMU sensor, applicable to a mobile robot mounted with the RGB-D sensor and the IMU sensor, for locating the mobile robot, the locating method comprising steps of: S1, acquiring data via the RGB-D sensor and the IMU sensor, wherein the mobile robot mounted with the RGB-D sensor and the IMU sensor freely moves in a three-dimensional space, and the RGB-D sensor and the IMU sensor acquire data, respectively; S2, performing an IMU pre-integration, wherein a keyframe is selected by using a visual SLAM algorithm, according to quality of different frames of image data acquired by the RGB-D sensor, and based on the selected keyframe, a pre-integration of the IMU sensor between keyframes is calculated through an IMU pre-integration equation; S3, initializing the mobile robot, wherein a pose of the RGB-D sensor is obtained by initializing the RGB-D sensor using the visual SLAM algorithm, and then an initialized state of the mobile robot is obtained by initializing the mobile robot according to the pose of the RGB-D sensor and the IMU pre-integration of the corresponding keyframe, and wherein the state of the mobile robot comprises a pose and a velocity of the mobile robot, and a bias of the IMU sensor; S4, performing a tracking optimization on a state of the mobile robot, wherein the tracking optimization is performed on the estimated state of the mobile robot by using an optimization algorithm based on a reprojection error, and then a nonlinear optimization is performed on the state of the mobile robot by using an optimization algorithm based on a sliding window; and S5, performing a loop detection, wherein according to the data acquired by the RGB-D sensor, a loop optimization is performed on the pose of the RGB-D sensor and feature points by using a loop optimization algorithm based on a bag-of-words model, and then an error term of the IMU sensor is optimized by using an optimization algorithm based on a sliding window.
  2. 2. The locating method of claim 1, wherein an acquiring frequency of the IMU sensor is greater than an acquiring frequency of the RGB-D sensor.
  3. 3. The locating method of claim 1, wherein a rule of using the visual SLAM algorithm to select the keyframe in the step S2 is as follows: the keyframe is selected according to a frame interval between a current image frame and a previous keyframe acquired by the RGB-D sensor, and the number of common visual feature points between the current image frame and the previous keyframe.
  4. 4. The locating method of claim 1, wherein the IMU pre-integration is calculated in the step S2, and a state at a next time is calculated from a state at a current time through the IMU pre-integration by a formula as follows: $R_{i+1}^{WB} = R_i^{WB} \, \Delta R_{i,i+1} \, \mathrm{Exp}(J_{\Delta R}^g b_i^g)$, $v_{i+1} = v_i + g^W \Delta t_{i,i+1} + R_i^{WB} ( \Delta v_{i,i+1} + J_{\Delta v}^g b_i^g + J_{\Delta v}^a b_i^a )$, and $p_{i+1} = p_i + v_i \Delta t_{i,i+1} + \tfrac{1}{2} g^W \Delta t_{i,i+1}^2 + R_i^{WB} ( \Delta p_{i,i+1} + J_{\Delta p}^g b_i^g + J_{\Delta p}^a b_i^a )$, where $R_{i+1}^{WB}$ represents a rotation matrix of a coordinate system B of the IMU sensor when rotated to a world coordinate system at a time i + 1, $v_{i+1}$ and $p_{i+1}$ represent a velocity and a pose of the coordinate system B of the IMU sensor at the time i + 1 in the world coordinate system respectively, $\Delta t_{i,i+1}$ represents an interval from a time i to the time i + 1 of the IMU sensor, $b_i^g$ and $b_i^a$ represent a gyroscope bias and an accelerometer bias of the IMU sensor at the time i respectively, $g^W$ represents a gravitational acceleration in the world coordinate system, $\Delta R_{i,i+1}$, $\Delta v_{i,i+1}$ and $\Delta p_{i,i+1}$ represent pre-integrations of the rotation matrix, a velocity transformation, and a displacement transformation of the IMU sensor from the time i to the time i + 1 respectively, $J_{\Delta R}^g$, $J_{\Delta v}^g$ and $J_{\Delta p}^g$ represent first-order approximation Jacobian matrices when $\Delta R$, $\Delta v$ and $\Delta p$ change with a gyroscope bias $b_i^g$ of the IMU sensor respectively, and $J_{\Delta v}^a$ and $J_{\Delta p}^a$ represent first-order approximation Jacobian matrices when $\Delta v$ and $\Delta p$ change with an accelerometer bias $b_i^a$ of the IMU sensor respectively.
5. The locating method of claim 1, wherein the step S3 comprises steps of:

S31, using the visual SLAM algorithm to initialize the RGB-D sensor, to obtain the pose of the RGB-D sensor, which comprises steps of:

S311, using an ORB feature extraction algorithm to extract ORB feature points of image data acquired by the RGB-D sensor, and finding out three-dimensional space points corresponding to the ORB feature points according to depth data of the RGB-D sensor, thereby obtaining the pose of the RGB-D sensor according to pixel coordinates of the ORB feature points and three-dimensional coordinates of the space points;

S312, for a space point X_k = [U_k, V_k, W_k]^T, its pixel coordinate in a coordinate system of the RGB-D sensor being u_k = [x_k, y_k]^T, a correlation between the coordinates of the space points and the pixel coordinates being:

s · u_k = K · exp(ξ^) · X_k,

where s represents a scale factor of a depth map, K represents an intrinsic parameter matrix of the RGB-D sensor, ξ represents an estimated pose, ξ^ represents an antisymmetric matrix corresponding to the pose, and k represents a serial number of the feature point; and

S313, optimizing the pose of the RGB-D sensor by constructing a least square relation, to minimize an error:

ξ* = argmin_ξ (1/2) · Σ_{k=1..n} || u_k − (1/s) · K · exp(ξ^) · X_k ||²_2,

where ξ* represents an optimal pose after being optimized, ξ represents the estimated pose, and n represents the number of feature points;

S32, selecting consecutive N keyframes according to the visual SLAM algorithm, using the IMU pre-integration equation to integrate gyroscope data of the IMU sensor between the consecutive N keyframes, to obtain a rotation transformation of the IMU sensor between the N keyframes, and estimating a gyroscope bias b^g of the IMU sensor by establishing a minimum error equation of rotation transformations of the RGB-D sensor and the IMU sensor between the N keyframes:

b^g* = argmin_{b^g} Σ_{i=1..N−1} || Log( (ΔR_{i,i+1} · Exp(J^g_{ΔR} · b^g))^T · R^{BW}_{i+1} · R^{WB}_i ) ||²,

where N represents the number of keyframes, ΔR_{i,i+1} represents a rotation matrix obtained through the IMU pre-integration between a keyframe i and a keyframe i+1, J^g_{ΔR} represents a first-order approximation Jacobian matrix when ΔR changes with the gyroscope bias of the IMU sensor, R^{WB}_i = R^{WC}_i · R_{CB}, R^{WC}_i represents a rotation of the keyframe i as a rotation of the coordinate system of the RGB-D sensor relative to a world coordinate system obtained by the visual SLAM algorithm, R^{BW}_{i+1} represents a rotation matrix when the world coordinate system is transformed to a coordinate system of the IMU sensor at the keyframe i+1, and R_{CB} represents a calibrated rotation matrix when the calibrated IMU sensor is transformed to the coordinate system of the RGB-D sensor; and

S33, estimating an accelerometer bias and a gravity vector of the IMU sensor by using an estimated result of the gyroscope bias b^g of the IMU sensor, which comprises steps of:

S331, according to a recurrence relation of the IMU pre-integration equation between three consecutive keyframes, and a transformation equation between the coordinate system of the IMU sensor and the coordinate system of the RGB-D sensor, obtaining a linear equation:

[α(i)  β(i)] · [δθ_xy^T  (b^a)^T]^T = γ(i),

where

α(i) = [ −(1/2) · R_{WI} · (ĝ_I)^× · G · (Δt²_{12} · Δt_{23} + Δt_{12} · Δt²_{23}) ]_(:,1:2),

β(i) = R^{WB}_1 · J^a_{Δv12} · Δt_{12} · Δt_{23} + R^{WB}_2 · J^a_{Δp23} · Δt_{12} − R^{WB}_1 · J^a_{Δp12} · Δt_{23}, and

γ(i) = (p^W_{C3} − p^W_{C2}) · Δt_{12} − (p^W_{C2} − p^W_{C1}) · Δt_{23} + (R^{WC}_3 − R^{WC}_2) · p^C_B · Δt_{12} − (R^{WC}_2 − R^{WC}_1) · p^C_B · Δt_{23} − (1/2) · R_{WI} · ĝ_I · G · (Δt²_{12} · Δt_{23} + Δt_{12} · Δt²_{23}) − R^{WB}_1 · Δv_{12} · Δt_{12} · Δt_{23} − R^{WB}_2 · Δp_{23} · Δt_{12} + R^{WB}_1 · Δp_{12} · Δt_{23},

where δθ_xy represents the two-dimensional vector formed by the first two components of the gravity deviation vector δθ = [δθ_xy^T, 0]^T, δθ represents a gravity deviation vector, b^a represents an initialized accelerometer bias of the IMU sensor, R_{WI} represents a rotation giving the direction of the inertial reference coordinate system in the world coordinate system, Δt_{12} and Δt_{23} represent the time intervals from the keyframe 1 to the keyframe 2 and from the keyframe 2 to the keyframe 3 of the IMU sensor respectively, G represents a reference value of gravity, ĝ_I represents a gravity direction vector in the inertial coordinate system, (ĝ_I)^× represents the antisymmetric matrix of ĝ_I used in the first-order approximation of the gravity vector, [·]_(:,1:2) represents that only the data of the first two columns of a matrix is taken, R^{WB}_1 and R^{WB}_2 represent rotation matrixes of the coordinate system of the IMU sensor relative to the world coordinate system at the keyframes 1 and 2 respectively, Δp_{12} and Δp_{23} represent displacement pre-integrations of the IMU sensor from the keyframe 1 to the keyframe 2 and from the keyframe 2 to the keyframe 3 respectively, Δv_{12} represents a velocity pre-integration of the IMU sensor from the keyframe 1 to the keyframe 2, J^a_{Δv12}, J^a_{Δp23} and J^a_{Δp12} represent first-order approximation Jacobian matrixes when Δv_{12}, Δp_{23} and Δp_{12} change with the accelerometer bias of the IMU sensor respectively, R^{WC}_1, R^{WC}_2 and R^{WC}_3 represent rotation matrixes of the coordinate system C of the RGB-D sensor relative to the world coordinate system at the keyframes 1, 2, and 3 respectively, p^W_{C1}, p^W_{C2} and p^W_{C3} represent coordinates of a center of the coordinate system of the RGB-D sensor in the world coordinate system at the keyframes 1, 2, and 3 respectively, and p^C_B represents coordinates of a center of the coordinate system of the IMU sensor in the coordinate system of the RGB-D sensor;

S332, solving the two-dimensional gravity deviation vector δθ_xy and the accelerometer bias b^a through singular value decomposition; and

S333, calculating a velocity v_i of the IMU sensor as an initialized velocity of the IMU sensor through an equation:

v_i · Δt_{i,i+1} = p^W_{C,i+1} − p^W_{C,i} + (R^{WC}_{i+1} − R^{WC}_i) · p^C_B − (1/2) · (R_{WI} · ĝ_I · G − R_{WI} · (ĝ_I)^× · G · δθ) · Δt²_{i,i+1} − R^{WB}_i · (Δp_{i,i+1} + J^g_{Δp} · b^g_i),

where J^g_{Δp} represents a first-order approximation Jacobian matrix when Δp changes with the gyroscope bias of the IMU sensor, p^C_B represents coordinates of the center of the coordinate system of the IMU sensor in the coordinate system of the RGB-D sensor, p^W_{C,i} represents coordinates of the center of the coordinate system of the RGB-D sensor in the world coordinate system at the keyframe i, v_i represents the velocity of the IMU sensor at the keyframe i, R^{WC}_i represents a rotation matrix of the coordinate system of the RGB-D sensor relative to the world coordinate system at the keyframe i, R^{WB}_i represents a rotation matrix of the coordinate system of the IMU sensor relative to the world coordinate system at the keyframe i, and Δt_{i,i+1} represents a time interval from the keyframe i to the keyframe i+1 of the IMU sensor.
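As a concrete illustration of steps S331 and S332, the per-triplet blocks α(i), β(i) and γ(i) can be stacked into one over-determined linear system and solved with an SVD-based least-squares routine. The Python sketch below assumes those blocks have already been built from the keyframe poses and pre-integrations; the names and shapes are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def solve_gravity_and_accel_bias(alphas, betas, gammas):
    """Stack per-keyframe-triplet blocks alpha(i) (3x2), beta(i) (3x3) and
    gamma(i) (3,) into A x = b and solve for x = [dtheta_xy, b_a] with an
    SVD-based least-squares solve (step S332)."""
    A = np.hstack([np.vstack(alphas), np.vstack(betas)])  # shape (3N, 5)
    b = np.concatenate(gammas)                            # shape (3N,)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)             # lstsq uses SVD internally
    dtheta_xy, b_a = x[:2], x[2:]
    return dtheta_xy, b_a
```

Each keyframe triplet contributes three equations and there are only five unknowns (two gravity-direction corrections and three accelerometer-bias components), so a handful of triplets already gives a well-conditioned solve in this sketch.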
6. The locating method of claim 1, wherein the step S4 comprises steps of:

S41, estimating a pose of the RGB-D sensor by using the visual SLAM algorithm;

S42, establishing a constraint relation between keyframes, and establishing a least square relation based on a reprojection error, according to an error term of the IMU sensor and the pose of the mobile robot, wherein:

E_proj(k, j) represents the reprojection error of the pose and the feature points, and is expressed as follows:

E_proj(k, j) = (s · u_k − K · exp(ξ^) · X^C_k)^T · Σ_k · (s · u_k − K · exp(ξ^) · X^C_k), and
X^C_k = R_{CB} · R^{BW}_j · (X^W_k − p^W_j) + p^C_B;

E_IMU(i, j) represents the error term of the IMU sensor, and is expressed as follows:

E_IMU(i, j) = [e_R^T e_v^T e_p^T] · Σ_I · [e_R^T e_v^T e_p^T]^T + e_b^T · Σ_R · e_b,
e_R = Log( (ΔR_{i,j} · Exp(J^g_{ΔR} · b^g_j))^T · R^{BW}_i · R^{WB}_j ),
e_v = R^{BW}_i · (v^W_j − v^W_i − g_W · Δt_{i,j}) − (Δv_{i,j} + J^g_{Δv} · b^g_j + J^a_{Δv} · b^a_j),
e_p = R^{BW}_i · (p^W_j − p^W_i − v^W_i · Δt_{i,j} − (1/2) · g_W · Δt²_{i,j}) − (Δp_{i,j} + J^g_{Δp} · b^g_j + J^a_{Δp} · b^a_j), and
e_b = b_j − b_i,

where k represents a sequence number of a feature point of an image, i represents a sequence number of a keyframe, j represents a sequence number of a current keyframe, s represents a scale factor of a depth map, u_k represents two-dimensional coordinates of a k-th feature point in a camera coordinate system of the RGB-D sensor, K represents an intrinsic parameter matrix of the RGB-D sensor, ξ^ represents an antisymmetric matrix of the pose of the RGB-D sensor, X^C_k represents coordinates of the k-th feature point X_k in a coordinate system C of the RGB-D sensor, Σ_k represents a scale information matrix of the feature point X_k and a corresponding pixel point, R_{CB} represents a calibrated rotation matrix when the calibrated IMU sensor is transformed to the coordinate system of the RGB-D sensor, R^{BW}_j represents a rotation matrix of a world coordinate system relative to a coordinate system of the IMU sensor at the keyframe j, X^W_k represents coordinates of the k-th feature point X_k in the world coordinate system W, p^W_j represents coordinates of the IMU sensor in the world coordinate system at the keyframe j, p^C_B represents coordinates of a center of the coordinate system of the IMU sensor in the coordinate system of the RGB-D sensor, ΔR_{i,j}, Δv_{i,j} and Δp_{i,j} represent pre-integrations of a rotation matrix, a velocity transformation, and a displacement transformation of the IMU sensor from the keyframe i to the keyframe j respectively, J^g_{ΔR}, J^g_{Δv} and J^g_{Δp} represent first-order approximation Jacobian matrixes when ΔR, Δv and Δp change with a gyroscope bias of the IMU sensor respectively, J^a_{Δv} and J^a_{Δp} represent first-order approximation Jacobian matrixes when Δv and Δp change with an accelerometer bias of the IMU sensor respectively, v^W_j represents a velocity of the IMU sensor at the keyframe j, R^{WB}_j represents a rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the keyframe j, b^g_j and b^a_j represent a gyroscope bias and an accelerometer bias of the IMU sensor at the keyframe j respectively, Σ_R and Σ_I represent information matrixes of biases of the IMU sensor when randomly walking and being pre-integrated respectively, b_i and b_j represent biases of the IMU sensor at the keyframes i and j respectively, and Δt_{i,j} represents a time interval from the keyframe i to the keyframe j;

S43, only establishing a constraint relation between the current keyframe j and an adjacent keyframe j−1 for a state tracking of the mobile robot at the current keyframe:

θ* = argmin_θ ( Σ_k E_proj(k, j) + E_IMU(j−1, j) ), and
θ = {R^{WB}_j, p^W_j, v^W_j, b^g_j, b^a_j},

where θ represents a set of objects required to be optimized, θ* represents a set of optimized objects, k represents the sequence number of the feature point, j represents the sequence number of the current keyframe, R^{WB}_j represents a rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the keyframe j, p^W_j represents a translation of the IMU sensor in the world coordinate system at the keyframe j, v^W_j represents the velocity of the IMU sensor at the keyframe j, and b^g_j and b^a_j represent the gyroscope bias and the accelerometer bias of the IMU sensor at the keyframe j respectively; a pose of the mobile robot, and optimal values of the velocity and the biases of the IMU sensor at the current keyframe, are solved as the state tracking of the mobile robot; and

S44, using an optimization based on a sliding window to non-linearly optimize the state of the mobile robot, and establishing a constraint relation between the pose of the mobile robot, the feature points, and the error term of the IMU sensor to perform optimization for a keyframe M before the current keyframe:

Φ* = argmin_Φ ( Σ_k E_proj(k, j) + E_IMU(i, j) ), and
Φ = {R^{WB}_j, p^W_j, v^W_j, b^g_j, b^a_j},

where Φ represents the set of objects required to be optimized, Φ* represents the set of optimized objects, k represents the sequence number of the feature point, i represents the sequence number of the keyframe, j represents the sequence number of the current keyframe, R^{WB}_j represents a rotation matrix of the coordinate system B of the IMU sensor when rotated to the world coordinate system at the keyframe j, p^W_j represents a translation of the IMU sensor in the world coordinate system at the keyframe j, v^W_j represents the velocity of the IMU sensor at the keyframe j, and b^g_j and b^a_j represent the gyroscope bias and the accelerometer bias of the IMU sensor at the keyframe j respectively; and

only establishing a constraint relation between a pose and feature points of a keyframe that has a common viewpoint with the current keyframe, to perform optimization for a keyframe M+1 before the current keyframe:

Ψ* = argmin_Ψ Σ_k E_proj(k, j), and
Ψ = {R^W_j, p^W_j},

where Ψ represents the set of objects required to be optimized, and Ψ* represents the set of optimized objects.
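To make the step-S42 residuals of claim 6 concrete, the Python sketch below evaluates e_R, e_v, e_p and e_b between two keyframes, using SciPy's rotation utilities for the SO(3) Exp/Log maps. It assumes the keyframe states, pre-integrated terms and bias-correction Jacobians are available from earlier steps; it is one illustrative reading of the error terms, not the patent's implementation.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def so3_exp(w):
    """Rotation vector -> rotation matrix (SO(3) exponential map)."""
    return Rotation.from_rotvec(w).as_matrix()

def so3_log(R):
    """Rotation matrix -> rotation vector (SO(3) logarithm map)."""
    return Rotation.from_matrix(R).as_rotvec()

def imu_error_terms(R_i, R_j, v_i, v_j, p_i, p_j,
                    dR, dv, dp, JR_g, Jv_g, Jv_a, Jp_g, Jp_a,
                    b_g_j, b_a_j, b_i, b_j, dt,
                    g_w=np.array([0.0, 0.0, -9.81])):
    """IMU residuals between keyframes i and j as in E_IMU(i, j) of step S42.

    R_i, R_j are world-from-body rotation matrices, v_*/p_* velocities and
    positions in the world frame, dR/dv/dp the pre-integrated terms,
    J*_g/J*_a the bias Jacobians, b_* the biases, dt the keyframe interval."""
    e_R = so3_log((dR @ so3_exp(JR_g @ b_g_j)).T @ R_i.T @ R_j)
    e_v = R_i.T @ (v_j - v_i - g_w * dt) - (dv + Jv_g @ b_g_j + Jv_a @ b_a_j)
    e_p = (R_i.T @ (p_j - p_i - v_i * dt - 0.5 * g_w * dt ** 2)
           - (dp + Jp_g @ b_g_j + Jp_a @ b_a_j))
    e_b = b_j - b_i
    return e_R, e_v, e_p, e_b
```

In a full system these residuals, weighted by the information matrices, would be handed to a non-linear least-squares solver together with the reprojection terms; the sketch only shows how the individual residuals are formed.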
7. The locating method of claim 1, wherein the step S5 comprises steps of: S51, using a loop detection algorithm based on a bag-of-words model to determine whether there is a loop in an image acquired by the RGB-D sensor at a current keyframe; S52, if no loop is detected, outputting a pose of the mobile robot at the current keyframe, ending the loop optimization, and then tracking a subsequent state of the mobile robot; and if a loop is detected, performing the loop optimization on the pose of the RGB-D sensor and the feature points based on the visual SLAM algorithm; and S53, optimizing the pose of the mobile robot, the feature points, and the error term of the IMU sensor by using the non-linear optimization algorithm based on the sliding window, and outputting the pose of the mobile robot at the current keyframe.
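Step S51 relies on a bag-of-words model for loop detection. The simplified Python sketch below scores keyframes by cosine similarity of normalized visual-word histograms and skips recent neighbours; a real system would typically use a trained vocabulary such as DBoW2 and confirm the candidate with geometric verification. The threshold, gap and function names here are illustrative assumptions, not values from the patent.

```python
import numpy as np

def bow_vector(word_ids, vocab_size):
    """Normalized bag-of-words histogram over the visual-word ids assigned to
    a keyframe's ORB descriptors (simplified stand-in for a DBoW-style vector)."""
    v = np.bincount(np.asarray(word_ids, dtype=int), minlength=vocab_size).astype(float)
    norm = np.linalg.norm(v)
    return v / norm if norm > 0.0 else v

def detect_loop(current_vec, past_vecs, min_gap=30, threshold=0.75):
    """Return the index of the best loop candidate among older keyframes, or
    None if no similarity exceeds the threshold (cf. steps S51/S52)."""
    candidates = past_vecs[:-min_gap] if len(past_vecs) > min_gap else []
    best_idx, best_score = None, threshold
    for idx, vec in enumerate(candidates):
        score = float(np.dot(current_vec, vec))  # cosine similarity of unit vectors
        if score >= best_score:
            best_idx, best_score = idx, score
    return best_idx
```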
GB2116617.8A 2020-03-11 2021-03-10 RGBD sensor and IMU sensor-based positioning method Pending GB2597632A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010166801.0A CN111462231B (en) 2020-03-11 2020-03-11 Positioning method based on RGBD sensor and IMU sensor
PCT/CN2021/080007 WO2021180128A1 (en) 2020-03-11 2021-03-10 Rgbd sensor and imu sensor-based positioning method

Publications (2)

Publication Number Publication Date
GB202116617D0 GB202116617D0 (en) 2022-01-05
GB2597632A true GB2597632A (en) 2022-02-02

Family

ID=71680707

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2116617.8A Pending GB2597632A (en) 2020-03-11 2021-03-10 RGBD sensor and IMU sensor-based positioning method

Country Status (3)

Country Link
CN (1) CN111462231B (en)
GB (1) GB2597632A (en)
WO (1) WO2021180128A1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111462231B (en) * 2020-03-11 2023-03-31 华南理工大学 Positioning method based on RGBD sensor and IMU sensor
CN112085776B (en) * 2020-07-31 2022-07-19 山东科技大学 Direct method unsupervised monocular image scene depth estimation method
CN112179373A (en) * 2020-08-21 2021-01-05 同济大学 Measuring method of visual odometer and visual odometer
CN112461228B (en) * 2020-11-03 2023-05-09 南昌航空大学 IMU and vision-based secondary loop detection positioning method in similar environment
CN113159197A (en) * 2021-04-26 2021-07-23 北京华捷艾米科技有限公司 Pure rotation motion state judgment method and device
CN113610001B (en) * 2021-08-09 2024-02-09 西安电子科技大学 Indoor mobile terminal positioning method based on combination of depth camera and IMU
CN114234967B (en) * 2021-12-16 2023-10-20 浙江大学 Six-foot robot positioning method based on multi-sensor fusion
CN114549656A (en) * 2022-02-14 2022-05-27 希姆通信息技术(上海)有限公司 Calibration method for AR (augmented reality) glasses camera and IMU (inertial measurement Unit)
CN114913245B (en) * 2022-06-08 2023-07-28 上海鱼微阿科技有限公司 Multi-calibration-block multi-camera calibration method and system based on undirected weighted graph
CN115855042A (en) * 2022-12-12 2023-03-28 北京自动化控制设备研究所 Pedestrian visual navigation method based on laser radar cooperative assistance
CN116442248A (en) * 2023-06-19 2023-07-18 山东工程职业技术大学 Robot vision positioning module and method based on target detection
CN117272593B (en) * 2023-08-24 2024-04-05 无锡北微传感科技有限公司 Wind tunnel test data analysis processing method
CN116883502B (en) * 2023-09-05 2024-01-09 深圳市智绘科技有限公司 Method, device, medium and equipment for determining camera pose and landmark point

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190294903A1 (en) * 2018-03-20 2019-09-26 Vangogh Imaging, Inc. 3d vision processing using an ip block
CN108846867A (en) * 2018-08-29 2018-11-20 安徽云能天智能科技有限责任公司 A kind of SLAM system based on more mesh panorama inertial navigations
CN109307508A (en) * 2018-08-29 2019-02-05 中国科学院合肥物质科学研究院 A kind of panorama inertial navigation SLAM method based on more key frames
CN109993113A (en) * 2019-03-29 2019-07-09 东北大学 A kind of position and orientation estimation method based on the fusion of RGB-D and IMU information
CN110345944A (en) * 2019-05-27 2019-10-18 浙江工业大学 Merge the robot localization method of visual signature and IMU information
CN111462231A (en) * 2020-03-11 2020-07-28 华南理工大学 Positioning method based on RGBD sensor and IMU sensor

Also Published As

Publication number Publication date
WO2021180128A1 (en) 2021-09-16
CN111462231A (en) 2020-07-28
CN111462231B (en) 2023-03-31
GB202116617D0 (en) 2022-01-05

Similar Documents

Publication Publication Date Title
GB2597632A (en) RGBD sensor and IMU sensor-based positioning method
CN111833333A (en) Binocular vision-based boom type tunneling equipment pose measurement method and system
CN110849367B (en) Indoor positioning and navigation method based on visual SLAM fused with UWB
CN110070615B (en) Multi-camera cooperation-based panoramic vision SLAM method
CN108955718A (en) A kind of visual odometry and its localization method, robot and storage medium
CN109781092B (en) Mobile robot positioning and mapping method in dangerous chemical engineering accident
CA2153693C (en) Method for determining borehole direction
CN111024066A (en) Unmanned aerial vehicle vision-inertia fusion indoor positioning method
CN109648558B (en) Robot curved surface motion positioning method and motion positioning system thereof
CN107990899A (en) A kind of localization method and system based on SLAM
CN112734841B (en) Method for realizing positioning by using wheel type odometer-IMU and monocular camera
CN106384353A (en) Target positioning method based on RGBD
CN108170297B (en) Real-time six-degree-of-freedom VR/AR/MR device positioning method
CN111750853A (en) Map establishing method, device and storage medium
CN208751577U (en) A kind of robot indoor locating system
CN106574836A (en) A method for localizing a robot in a localization plane
CN106403942A (en) Personnel indoor inertial positioning method based on depth image recognition in transformer substation
CN112015187B (en) Semantic map construction method and system for intelligent mobile robot
CN108958232A (en) A kind of mobile sweeping robot SLAM device and algorithm based on deep vision
CN109764869A (en) A kind of positioning of autonomous crusing robot and the three-dimensional map construction method of binocular camera and inertial navigation fusion
CN111508026A (en) Vision and IMU integrated indoor inspection robot positioning and map construction method
CN111307146A (en) Virtual reality wears display device positioning system based on binocular camera and IMU
CN114627253A (en) Map construction method, device and equipment
CN112258633B (en) SLAM technology-based scene high-precision reconstruction method and device
CN113466791B (en) Laser mapping and positioning equipment and method

Legal Events

Date Code Title Description
789A Request for publication of translation (sect. 89(a)/1977)

Ref document number: 2021180128

Country of ref document: WO