WO2021180128A1 - A positioning method based on an RGBD sensor and an IMU sensor - Google Patents
A positioning method based on an RGBD sensor and an IMU sensor
- Publication number: WO2021180128A1
- Application: PCT/CN2021/080007
- Authority: WIPO (PCT)
Classifications
- G06T7/74 — Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
- G01C21/165 — Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments
- G01C21/206 — Instruments for performing navigational calculations specially adapted for indoor navigation
- G05D1/0246 — Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0248 — Control of position or course in two dimensions specially adapted to land vehicles using a video camera in combination with image processing means in combination with a laser
- G05D1/027 — Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising inertial navigation means, e.g. azimuth detector
- G06T7/579 — Depth or shape recovery from multiple images from motion
- G06T2207/10016 — Video; Image sequence
- G06T2207/10024 — Color image
- G06T2207/10028 — Range image; Depth image; 3D point clouds
- G06T2207/30252 — Vehicle exterior; Vicinity of vehicle
Definitions
- This application relates to the technical field of visual SLAM and sensor fusion, and in particular to a positioning method based on RGBD sensors and IMU sensors.
- Simultaneous Localization and Mapping (SLAM), first proposed by Smith, Self and Cheeseman in 1988, is considered the key to a truly fully autonomous mobile robot.
- The SLAM problem can be described as follows: a robot starts moving from an unknown location in an unknown environment, localizes itself from pose estimates and sensor data as it moves, and builds an incremental map at the same time.
- SLAM that obtains environmental information only through a camera is called visual SLAM. Because cameras are inexpensive, low-power, light, small, and provide rich image information, visual SLAM has been studied by a large number of scholars and experts at home and abroad.
- An IMU (Inertial Measurement Unit) sensor contains a three-axis gyroscope and a three-axis accelerometer; it measures the angular velocity and acceleration of an object in three-dimensional space and computes the object's attitude from them.
- IMU sensors are commonly used in equipment that requires motion control, such as automobiles and robots.
- Although a visual SLAM algorithm can estimate its own pose in three-dimensional space from camera data, feature-point tracking is easily lost during fast motion or pure rotation.
- Visual SLAM generally takes the first frame as the world coordinate system for estimation.
- The estimated pose is therefore relative to the first image frame rather than to a geographic world frame such as East-North-Up. An IMU sensor can provide pose information at all times, but suffers from zero drift during long-term motion.
- The drift biases the estimated pose, and the loop closure in visual SLAM can correct this problem well. Therefore, fusing visual SLAM with an IMU sensor better estimates the pose of the robot in unknown space.
- a positioning method based on an RGBD sensor and an IMU sensor is provided.
- The positioning method based on an RGBD sensor and an IMU sensor is applied to a mobile robot equipped with both sensors to realize the positioning of the mobile robot.
- The positioning method includes the following steps:
- Sensor data collection: the mobile robot, carrying the RGBD sensor and the IMU sensor, moves freely in three-dimensional space while RGBD sensor data and IMU sensor data are collected;
- IMU pre-integration: a visual SLAM algorithm selects key frames according to the image-data quality of different RGBD frames; based on the selected key frames, the IMU pre-integration equation computes the pre-integration of the IMU sensor data between key frames;
- Mobile robot initialization: the visual SLAM algorithm initializes the RGBD sensor to obtain its pose information, then the mobile robot is initialized from the RGBD pose information and the IMU pre-integration of the corresponding key frames, yielding the robot's initial state.
- The state of the mobile robot includes its pose information, velocity, and the biases of the IMU sensor;
- Tracking optimization of the robot state: an optimization algorithm based on the reprojection error tracks and optimizes the estimated state, after which a sliding-window optimization algorithm performs nonlinear optimization of the robot state;
- Loop detection: from the RGBD sensor data, a loop optimization algorithm based on the bag-of-words model loop-optimizes the RGBD pose information and feature points, after which a sliding-window optimization algorithm optimizes the IMU sensor error terms.
- The sampling frequency of the IMU sensor (more than 100 Hz) is much greater than that of the RGBD sensor (about 30 Hz), so the data must be aligned through IMU pre-integration.
- The rule for selecting key frames with the visual SLAM algorithm in step S2 is as follows: selection is based on the frame interval between the current image frame collected by the RGBD sensor and the previous key frame, and on the number of co-visible feature points between the current image frame and the previous key frame.
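The selection rule above can be sketched as a simple predicate. Both thresholds are illustrative assumptions; the patent does not give numeric values.

```python
# Illustrative sketch of the key-frame rule: a frame becomes a key frame when
# enough frames have passed since the last key frame, or when too few
# co-visible feature points remain. Thresholds are assumed, not from the patent.
MIN_FRAME_GAP = 20        # assumed minimum frame interval
MIN_COVISIBLE = 50        # assumed minimum number of co-visible features

def is_key_frame(frame_idx, last_key_idx, n_covisible):
    """Return True if the current frame should be promoted to a key frame."""
    if frame_idx - last_key_idx >= MIN_FRAME_GAP:
        return True          # too long since the last key frame
    if n_covisible < MIN_COVISIBLE:
        return True          # the view has changed too much
    return False

print(is_key_frame(25, 0, 200))   # gap of 25 frames -> True
print(is_key_frame(5, 0, 30))     # only 30 co-visible points -> True
print(is_key_frame(5, 0, 200))    # recent key frame, good overlap -> False
```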
- In step S2 the IMU pre-integration is calculated. The IMU sensor measures angular velocity and acceleration as
  $\tilde{\omega}(t) = \omega(t) + b_g(t) + \eta_g(t)$,
  $\tilde{a}(t) = R_{WB}^{T}(t)\,\big(a(t) - g_W\big) + b_a(t) + \eta_a(t)$,
- where $\omega(t)$ and $a(t)$ are the true angular velocity and acceleration, and $\tilde{\omega}(t)$ and $\tilde{a}(t)$ the angular velocity and acceleration measured by the IMU sensor;
- $R_{WB}$ is the rotation matrix from the IMU sensor coordinate system to the world coordinate system;
- $b_g(t)$ and $b_a(t)$ are the time-varying Gaussian biases of the IMU gyroscope and accelerometer;
- $\eta_g(t)$ and $\eta_a(t)$ are the time-varying white Gaussian noise of the IMU gyroscope and accelerometer;
- $g_W$ is the gravity vector in world coordinates.
- The pre-integration between two key frames is then computed with the IMU accelerometer and gyroscope biases separated out, so that the state at the next moment follows from the state at the current moment:
  $R_{WB}^{i+1} = R_{WB}^{i}\,\Delta R_{i,i+1}\,\mathrm{Exp}(J_{\Delta R}^{g}\, b_g^{i})$,
  $v^{i+1} = v^{i} + g_W \Delta t_{i,i+1} + R_{WB}^{i}\,\big(\Delta v_{i,i+1} + J_{\Delta v}^{g} b_g^{i} + J_{\Delta v}^{a} b_a^{i}\big)$,
  $p^{i+1} = p^{i} + v^{i}\Delta t_{i,i+1} + \tfrac{1}{2} g_W \Delta t_{i,i+1}^{2} + R_{WB}^{i}\,\big(\Delta p_{i,i+1} + J_{\Delta p}^{g} b_g^{i} + J_{\Delta p}^{a} b_a^{i}\big)$.
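The propagation above can be sketched numerically. This is a minimal illustration assuming NumPy, with the bias-correction Jacobian terms already folded into the increments; the function name, argument names, and gravity constant are illustrative, not from the patent.

```python
import numpy as np

def propagate(R_i, v_i, p_i, dR, dv, dp, dt, g_w=np.array([0.0, 0.0, -9.81])):
    """One pre-integration update: the state at key frame i+1 from the state at i.

    R_i : 3x3 world-from-body rotation; v_i, p_i : velocity/position in world;
    dR, dv, dp : bias-corrected pre-integrated increments; dt : time between frames.
    """
    R_j = R_i @ dR
    v_j = v_i + g_w * dt + R_i @ dv
    p_j = p_i + v_i * dt + 0.5 * g_w * dt**2 + R_i @ dp
    return R_j, v_j, p_j

# With zero increments the estimated state simply drifts along gravity,
# exactly as the v and p equations above predict.
R0, v0, p0 = np.eye(3), np.zeros(3), np.zeros(3)
R1, v1, p1 = propagate(R0, v0, p0, np.eye(3), np.zeros(3), np.zeros(3), 0.1)
print(v1)  # [0, 0, -0.981]
```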
- Step S3 includes:
- S31: The visual SLAM algorithm initializes the RGBD sensor and obtains its pose information. The process is as follows:
- S311: The ORB feature extraction algorithm extracts ORB feature points from the RGBD image data, and the corresponding three-dimensional space points are found from the RGBD depth data at those feature points, so that the pose of the RGBD sensor is obtained from the pixel coordinates of the ORB feature points and the three-dimensional coordinates of the space points;
- S312: For a space point $X_k = [U_k, V_k, W_k]^{T}$ whose pixel coordinates in the RGBD sensor coordinate system are $u_k = [x_k, y_k]$, the correspondence between space and pixel coordinates is
  $s\,u_k = K \exp(\xi^{\wedge}) X_k$,
- where $s$ is the scale factor of the depth map, $K$ the intrinsic matrix of the RGBD sensor, $\xi$ the estimated pose information, $\xi^{\wedge}$ the antisymmetric matrix corresponding to the pose information, and $k$ the index of the feature point;
- S313: A least-squares problem is built to optimize the pose of the RGBD sensor so that the error is minimal:
  $\xi^{*} = \arg\min_{\xi} \tfrac{1}{2}\sum_{k=1}^{n} \big\| u_k - \tfrac{1}{s} K \exp(\xi^{\wedge}) X_k \big\|^{2}$,
- where $\xi^{*}$ is the optimal pose after optimization, $\xi$ the estimated pose, and $n$ the number of feature points.
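The projection relation $s\,u_k = K \exp(\xi^{\wedge}) X_k$ can be illustrated directly. A minimal sketch assuming NumPy, writing the pose as an explicit rotation R and translation t instead of $\exp(\xi^{\wedge})$; the intrinsic values are hypothetical.

```python
import numpy as np

def project(K, R, t, X):
    """Pinhole projection of a 3D point X: u = normalize(K (R X + t)).

    The residual u_k - project(...) is the reprojection error minimized in S313."""
    Xc = R @ X + t                  # point in camera coordinates
    uvw = K @ Xc                    # homogeneous image coordinates
    return uvw[:2] / uvw[2]         # pixel coordinates

# Hypothetical intrinsics; the identity pose projects a point on the optical
# axis onto the principal point.
K = np.array([[525.0, 0.0, 320.0],
              [0.0, 525.0, 240.0],
              [0.0, 0.0, 1.0]])
u = project(K, np.eye(3), np.zeros(3), np.array([0.0, 0.0, 2.0]))
print(u)  # [320. 240.]
```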
- S32: For the N consecutive key frames selected by the visual SLAM algorithm, the IMU pre-integration equation integrates the gyroscope data between them to obtain the rotation transformation of the IMU sensor between the N key frames. By building the minimum-error equation between the rotation transformations of the RGBD sensor and the IMU sensor over the N key frames, the gyroscope bias $b_g$ is estimated:
  $b_g^{*} = \arg\min_{b_g} \sum_{i=1}^{N-1} \big\| \mathrm{Log}\big( \Delta R_{i,i+1}\, \mathrm{Exp}(J_{\Delta R}^{g}\, b_g)\, R_{BW}^{i+1}\, R_{WC}^{i}\, R_{CB} \big) \big\|^{2}$,
- where N is the number of key frames;
- $R_{WC}^{i}$ is the rotation of the RGBD sensor coordinate system relative to the world coordinate system obtained by the visual SLAM algorithm at key frame i, and $R_{BW}^{i+1}$ the rotation matrix from the world coordinate system to the IMU sensor coordinate system at key frame i+1;
- $R_{CB}$ is the calibrated rotation matrix from the IMU sensor to the RGBD sensor coordinate system;
- $\Delta R_{i,i+1}$ is the rotation matrix obtained by IMU integration between the consecutive key frames i and i+1, and $J_{\Delta R}^{g}$ the Jacobian of the first-order approximation of the variation of $\Delta R$ with the IMU gyroscope bias.
- S33: Using the estimated gyroscope bias $b_g$, the accelerometer bias and the gravity vector of the IMU sensor are estimated:
- S331: With $R_{WI}$ the direction of the IMU sensor in the world coordinate system, the gravity vector of the IMU sensor can be expressed as
  $g_W = R_{WI}\,\mathrm{Exp}(\delta\theta)\,\hat{g}_I\,G$, with $\delta\theta = [\delta\theta_{xy}^{T}\;\; 0]^{T}$.
- According to the transformation between the RGBD sensor coordinate system C and the IMU sensor coordinate system B, and the recursive relation of the IMU pre-integration equations over three consecutive key frames, a linear equation in $\delta\theta_{xy}$ and $b_a$ is obtained,
- where $\delta\theta_{xy}$ is the two-component part of the gravity deviation vector $\delta\theta$, $b_a$ the initial accelerometer bias of the IMU sensor, $\hat{g}_I$ the gravity direction in the IMU coordinate system, $G$ the gravity reference value, and $[\,]_{(:,1:2)}$ means that only the first two columns of the matrix are kept;
- $\Delta t_{12}$ and $\Delta t_{23}$ are the IMU time intervals from key frames 1, 2 to key frames 2, 3, $\Delta p_{12}$ and $\Delta p_{23}$ the IMU displacements from key frames 1, 2 to key frames 2, 3, and $\Delta v_{23}$ the velocity change from key frame 2 to key frame 3; $J_{\Delta v_{23}}^{a}$, $J_{\Delta p_{23}}^{a}$, $J_{\Delta p_{12}}^{a}$ are the first-order Jacobians of $\Delta v_{23}$, $\Delta p_{23}$, $\Delta p_{12}$ with respect to the accelerometer bias; $R_{WC}^{1}$, $R_{WC}^{2}$, $R_{WC}^{3}$ are the rotations of the RGBD coordinate system C relative to the world coordinate system at key frames 1, 2, 3, and $p_{WC}^{1}$, $p_{WC}^{2}$, $p_{WC}^{3}$ the coordinates of the RGBD coordinate-system center in the world coordinate system at those key frames; $p_{CB}$ denotes the coordinates of the IMU coordinate-system center in the RGBD sensor coordinate system;
- S332: The gravity deviation $\delta\theta_{xy}$ and the accelerometer bias $b_a$ are solved by singular value decomposition of this linear system;
- S34: The velocity of the IMU sensor at each key frame is then computed from the displacement pre-integration between neighbouring key frames, using $p_{WC}^{i}$ (the RGBD center coordinates in the world frame at key frame i), $v^{i}$ (the IMU velocity at key frame i), $R_{WC}^{i}$ and $R_{WB}^{i}$ (the rotations of the RGBD and IMU coordinate systems relative to the world frame at key frame i), $p_{CB}$, and $\Delta t_{i,i+1}$ (the IMU time interval from key frame i to key frame i+1).
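Steps S331 and S332 boil down to solving an overdetermined linear system by SVD. The sketch below shows only that solve, on a synthetic system, since building the true coefficient matrix requires the actual pre-integrations and key-frame poses; all values are stand-ins.

```python
import numpy as np

# S331-S332 reduce to A x = b with x = [dtheta_xy (2 entries), b_a (3 entries)].
# The entries of A and b here are random stand-ins; the SVD-based least-squares
# solve is the step the patent describes.
rng = np.random.default_rng(0)
x_true = np.array([0.01, -0.02, 0.05, -0.03, 0.08])   # [dtheta_xy, b_a]
A = rng.standard_normal((9, 5))      # several key-frame triplets -> 9 equations
b = A @ x_true

U, S, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ ((U.T @ b) / S)           # pseudo-inverse solution via SVD

print(np.allclose(x, x_true))  # noise-free system recovers x exactly -> True
```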
- Step S4 proceeds as follows:
- S41: The visual SLAM algorithm estimates the pose information $\xi$ of the RGBD sensor;
- S42: Constraint relations between key frames are built; from the IMU sensor error term and the pose information of the mobile robot, a least-squares relation is established on the reprojection error.
- $E_{proj}(k,j)$ is the reprojection error of the pose information and feature points:
  $E_{proj}(k,j) = e_{proj}^{T}\, \Sigma_k\, e_{proj}$, with $e_{proj} = u_k - \tfrac{1}{s} K\, X_k^{C}$ and $X_k^{C} = R_{CB}\, R_{BW}^{j} \big(X_k^{W} - p_{WB}^{j}\big) + p_{CB}$;
- $E_{IMU}(i,j)$ is the error term of the IMU sensor:
  $e_R = \mathrm{Log}\big( (\Delta R_{i,j}\,\mathrm{Exp}(J_{\Delta R}^{g} b_g^{j}))^{T} (R_{WB}^{i})^{T} R_{WB}^{j} \big)$,
  $e_v = (R_{WB}^{i})^{T} \big(v^{j} - v^{i} - g_W \Delta t_{i,j}\big) - \big(\Delta v_{i,j} + J_{\Delta v}^{g} b_g^{j} + J_{\Delta v}^{a} b_a^{j}\big)$,
  $e_p = (R_{WB}^{i})^{T} \big(p^{j} - p^{i} - v^{i}\Delta t_{i,j} - \tfrac{1}{2} g_W \Delta t_{i,j}^{2}\big) - \big(\Delta p_{i,j} + J_{\Delta p}^{g} b_g^{j} + J_{\Delta p}^{a} b_a^{j}\big)$,
  $e_b = b_j - b_i$,
  $E_{IMU}(i,j) = [e_R^{T}\; e_v^{T}\; e_p^{T}]\, \Sigma_I\, [e_R^{T}\; e_v^{T}\; e_p^{T}]^{T} + e_b^{T}\, \Sigma_R\, e_b$,
- where k is the feature-point index of the image, i the key-frame index, j the current key-frame index, s the scale factor of the depth map, $u_k$ the two-dimensional camera coordinates of the k-th feature point of the RGBD sensor, K the intrinsic matrix of the RGBD sensor, $\xi^{\wedge}$ the antisymmetric matrix of the RGBD pose information $\xi$, $X_k^{C}$ and $X_k^{W}$ the coordinates of feature point $X_k$ in the RGBD coordinate system C and the world coordinate system W, $\Sigma_k$ the scale information matrix of feature point $X_k$ and its pixel, $R_{CB}$ the calibrated rotation from the IMU sensor to the RGBD coordinate system, $R_{BW}^{j}$ the rotation of the world coordinate system relative to the IMU coordinate system at key frame j, and $\Sigma_R$, $\Sigma_I$ the information matrices of the IMU bias random walk and of the pre-integration;
- S43: For state tracking of the mobile robot at the current key frame j, a constraint is built only between j and the neighbouring key frame j-1:
  $\phi = \{R_{WB}^{j},\, p_{WB}^{j},\, v^{j},\, b_g^{j},\, b_a^{j}\}$, $\phi^{*} = \arg\min_{\phi} \big( \textstyle\sum_k E_{proj}(k,j) + E_{IMU}(j{-}1, j) \big)$,
- where $\phi$ is the set of objects to be optimized, $\phi^{*}$ the set of $\phi$ objects after optimization, k the feature-point index, and j the current key-frame index; the optimal pose, velocity and biases of the current key frame are solved as the state tracking of the mobile robot;
- S44: A sliding-window optimization algorithm then nonlinearly optimizes the robot state. For the M key frames before the current one, constraints are built on the robot's pose information, feature points and IMU error terms and jointly optimized, with $\phi$ again the set of objects to be optimized and $\phi^{*}$ the optimized set, k the feature-point index, i the key-frame index and j the current key-frame index;
- for key frames older than the window of M+1 frames, constraints are built and optimized only on the pose information and feature points of key frames sharing co-visible points with the current key frame, where $\psi$ is the set of objects to be optimized and $\psi^{*}$ the set of $\psi$ objects after optimization.
- Step S5 proceeds as follows:
- S51: A loop detection algorithm based on the bag-of-words model judges whether the image collected by the RGBD sensor at the current key frame closes a loop;
- S52: If no loop is detected, the pose of the mobile robot at the current key frame is output, the loop optimization step ends, and tracking of the subsequent robot states continues; if a loop is detected, the visual SLAM algorithm loop-optimizes the RGBD pose information and feature points;
- S53: A nonlinear optimization algorithm based on a sliding window optimizes the mobile robot's pose information, feature points, and IMU sensor error terms, and outputs the pose of the mobile robot at the current key frame.
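A bag-of-words loop check of the kind S51 relies on can be sketched as a nearest-histogram search. The unit-histogram representation and the 0.8 threshold are assumptions for illustration, not the patent's vocabulary or scoring.

```python
import numpy as np

def detect_loop(current_bow, past_bows, threshold=0.8):
    """Return the index of the most similar past key frame, or None.

    Each key frame is summarized here by a normalized visual-word histogram;
    similarity is the dot product (cosine similarity for unit vectors)."""
    best_idx, best_score = None, threshold
    for idx, bow in enumerate(past_bows):
        score = float(np.dot(current_bow, bow))
        if score > best_score:
            best_idx, best_score = idx, score
    return best_idx

v = np.array([1.0, 0.0, 0.0])
past = [np.array([0.0, 1.0, 0.0]), np.array([0.99, 0.141, 0.0])]
print(detect_loop(v, past))  # frame 1 is nearly identical -> 1
```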
- This application discloses a positioning method based on an RGBD sensor and an IMU sensor.
- The RGBD sensor and the IMU sensor acquire the relevant data of the mobile robot; the RGBD sensor is then initialized, the IMU data is pre-integrated, and the mobile robot is initialized.
- The pose information of the mobile robot is tracked with back-end optimization, and loop processing is performed when a loop occurs, realizing the positioning of the mobile robot.
- The positioning method fuses the RGBD sensor and the IMU sensor, overcoming the shortcomings of a purely visual SLAM algorithm, which easily loses tracking under rotation or against a uniform background, and of the IMU sensor, which accumulates zero bias during long-term motion.
- Fig. 1 is a flowchart of a positioning method based on RGBD sensor and IMU sensor disclosed in the present application;
- Figure 2 is a schematic diagram of the sliding window algorithm used in this application.
- This embodiment discloses a specific implementation of the positioning method based on an RGBD sensor and an IMU sensor; the steps are as follows:
- The RGBD sensor is a Kinect v2 sensor.
- The Kinect v2 sensor and the IMU sensor are fixed to a turtlebot2 mobile robot.
- The Kinect v2 sensor, the IMU sensor and the turtlebot2 mobile robot are connected to a notebook computer through USB interfaces, which controls the turtlebot2 as it moves in three-dimensional space.
- The collection frequency of the IMU sensor is greater than that of the Kinect v2 sensor.
- Key frames are selected through the ORBSLAM2 algorithm, according to the frame interval between the current image frame collected by the RGBD sensor and the previous key frame, and the number of co-visible feature points between the current image frame and the previous key frame;
- From the measured acceleration and angular velocity, the state at the next moment is calculated from the state at the current moment through IMU pre-integration, using the formulas of step S2 above.
- The mobile robot is initialized following steps S31 to S33 described above, with the visual ORBSLAM2 algorithm serving as the visual SLAM algorithm: ORB feature points are extracted and the RGBD pose is obtained and refined by least squares, the gyroscope bias $b_g$ is estimated from the rotation error between key frames, and the accelerometer bias and gravity vector are estimated from the resulting linear system by singular value decomposition.
- The gyroscope bias of the IMU sensor is estimated through two key frames, so N is taken as 2;
- the velocity v of the IMU sensor is calculated as the initial IMU sensor velocity.
- The state of the mobile robot comprises its pose information, velocity, and the biases of the IMU sensor.
- State tracking and optimization follow step S4 described above: constraints between key frames combine the reprojection error $E_{proj}(k,j)$ of the pose information and feature points with the IMU sensor error term $E_{IMU}(i,j)$; the current key frame is constrained only to its neighbouring key frame for tracking, and a sliding-window optimization then refines the pose information, feature points and IMU error terms over the M previous key frames.
- The size of M varies with the time interval between the current key frame and the previous key frames: M is set to the number of key frames within 5 s of the current key frame.
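The time-based window size above can be sketched as counting the key frames inside the 5 s horizon; the timestamps below are made-up sample values.

```python
def window_size(timestamps, horizon=5.0):
    """Number of previous key frames within `horizon` seconds of the newest one."""
    t_now = timestamps[-1]
    return sum(1 for t in timestamps[:-1] if t_now - t <= horizon)

stamps = [0.0, 1.5, 3.0, 4.0, 6.5, 7.0, 8.0]   # key-frame times in seconds
print(window_size(stamps))  # key frames at 3.0, 4.0, 6.5, 7.0 qualify -> 4
```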
- a non-linear optimization algorithm based on a sliding window optimizes the mobile robot's pose information, feature points, and IMU sensor error terms, and outputs the pose of the mobile robot in the current key frame.
Claims (7)
- A positioning method based on an RGBD sensor and an IMU sensor, applied to a mobile robot equipped with the RGBD sensor and the IMU sensor to realize positioning of the mobile robot, characterized in that the positioning method comprises the following steps: S1, sensor data collection: the mobile robot, carrying the RGBD sensor and the IMU sensor, moves freely in three-dimensional space while RGBD sensor data and IMU sensor data are collected; S2, IMU pre-integration: a visual SLAM algorithm selects key frames according to the image-data quality of different RGBD frames, and based on the selected key frames the IMU pre-integration equation computes the pre-integration of the IMU sensor data between key frames; S3, mobile robot initialization: the visual SLAM algorithm initializes the RGBD sensor to obtain its pose information, after which the mobile robot is initialized from the RGBD pose information and the IMU pre-integration of the corresponding key frames to obtain the initial state information of the mobile robot, wherein the state information comprises the robot's pose information, velocity, and IMU sensor biases; S4, tracking optimization of the mobile robot state: an optimization algorithm based on the reprojection error tracks and optimizes the estimated state information, after which a sliding-window optimization algorithm performs nonlinear optimization of the robot state; S5, loop detection: based on the RGBD sensor data, a loop optimization algorithm built on the bag-of-words model loop-optimizes the RGBD pose information and feature points, after which a sliding-window optimization algorithm optimizes the IMU sensor error terms.
- The positioning method based on an RGBD sensor and an IMU sensor according to claim 1, characterized in that the collection frequency of the IMU sensor is greater than the collection frequency of the RGBD sensor.
- The positioning method based on an RGBD sensor and an IMU sensor according to claim 1, characterized in that the rule for selecting key frames with the visual SLAM algorithm in step S2 is as follows: selection is based on the frame interval between the current image frame collected by the RGBD sensor and the previous key frame, and on the number of feature points co-visible between the current image frame and previous key frames.
- The positioning method based on an RGBD sensor and an IMU sensor according to claim 1, characterized in that step S2 computes the IMU pre-integration, deriving the state at the next moment from the state at the current moment through IMU pre-integration with the formulas: $R_{WB}^{i+1} = R_{WB}^{i}\,\Delta R_{i,i+1}\,\mathrm{Exp}(J_{\Delta R}^{g} b_g^{i})$, $v^{i+1} = v^{i} + g_W \Delta t_{i,i+1} + R_{WB}^{i}(\Delta v_{i,i+1} + J_{\Delta v}^{g} b_g^{i} + J_{\Delta v}^{a} b_a^{i})$, $p^{i+1} = p^{i} + v^{i}\Delta t_{i,i+1} + \tfrac{1}{2} g_W \Delta t_{i,i+1}^{2} + R_{WB}^{i}(\Delta p_{i,i+1} + J_{\Delta p}^{g} b_g^{i} + J_{\Delta p}^{a} b_a^{i})$, where $R_{WB}^{i+1}$ is the rotation matrix from the IMU sensor coordinate system B to the world coordinate system at time i+1, $v^{i+1}$ and $p^{i+1}$ the velocity and pose of the IMU coordinate system B in the world coordinate system at time i+1, $\Delta t_{i,i+1}$ the interval from time i to time i+1 of the IMU sensor, $b_g^{i}$ and $b_a^{i}$ the IMU gyroscope and accelerometer biases at time i, $g_W$ the gravitational acceleration in the world coordinate system, $\Delta R_{i,i+1}$, $\Delta v_{i,i+1}$, $\Delta p_{i,i+1}$ the pre-integrations of the rotation, velocity change and displacement of the IMU sensor from time i to time i+1, $J_{\Delta R}^{g}$, $J_{\Delta v}^{g}$, $J_{\Delta p}^{g}$ the first-order Jacobians of $\Delta R$, $\Delta v$, $\Delta p$ with respect to the gyroscope bias, and $J_{\Delta v}^{a}$, $J_{\Delta p}^{a}$ the first-order Jacobians of $\Delta v$, $\Delta p$ with respect to the accelerometer bias.
- The positioning method based on an RGBD sensor and an IMU sensor according to claim 1, characterized in that step S3 comprises: S31, initializing the RGBD sensor with the visual SLAM algorithm to obtain its pose information, as follows: S311, the ORB feature extraction algorithm extracts ORB feature points from the RGBD image data, and the corresponding three-dimensional space points are found from the RGBD depth data at those points, so that the RGBD pose is obtained from the pixel coordinates of the ORB feature points and the three-dimensional coordinates of the space points; S312, for a space point $X_k=[U_k,V_k,W_k]^{T}$ whose pixel coordinates in the RGBD sensor coordinate system are $u_k=[x_k,y_k]$, the correspondence between space and pixel coordinates is $s\,u_k = K\exp(\xi^{\wedge})X_k$, where s is the scale factor of the depth map, K the intrinsic matrix of the RGBD sensor, $\xi$ the estimated pose information, $\xi^{\wedge}$ the antisymmetric matrix corresponding to the pose information, and k the feature-point index; S313, a least-squares problem is built to optimize the RGBD pose so that its error is minimal: $\xi^{*}=\arg\min_{\xi}\tfrac{1}{2}\sum_{k=1}^{n}\|u_k-\tfrac{1}{s}K\exp(\xi^{\wedge})X_k\|^{2}$, where $\xi^{*}$ is the optimal pose after optimization, $\xi$ the estimated pose, and n the number of feature points; S32, for the N consecutive key frames selected by the visual SLAM algorithm, the IMU pre-integration equation integrates the gyroscope data between them to obtain the rotation transformation of the IMU sensor between the N key frames, and the minimum-error equation between the RGBD and IMU rotation transformations over the N key frames estimates the gyroscope bias $b_g$: $b_g^{*}=\arg\min_{b_g}\sum_{i=1}^{N-1}\|\mathrm{Log}(\Delta R_{i,i+1}\,\mathrm{Exp}(J_{\Delta R}^{g}b_g)\,R_{BW}^{i+1}\,R_{WC}^{i}R_{CB})\|^{2}$, where N is the number of key frames, $\Delta R_{i,i+1}$ the rotation matrix obtained by IMU integration between key frames i and i+1, $J_{\Delta R}^{g}$ the Jacobian of the first-order approximation of the variation of $\Delta R$ with the gyroscope bias, $R_{WC}^{i}$ the rotation of the RGBD coordinate system relative to the world coordinate system obtained by the visual SLAM algorithm at key frame i, $R_{BW}^{i+1}$ the rotation matrix from the world coordinate system to the IMU coordinate system at key frame i+1, and $R_{CB}$ the calibrated rotation matrix from the IMU sensor to the RGBD sensor coordinate system; S33, using the estimated gyroscope bias $b_g$, the accelerometer bias and the gravity vector of the IMU sensor are estimated: S331, from the recursive relation of the IMU pre-integration equations over three consecutive key frames and the conversion between the IMU and RGBD coordinate systems, a linear equation is obtained, where $\delta\theta_{xy}$ is the two-component part of the gravity deviation vector $\delta\theta$, $b_a$ the initialized IMU accelerometer bias, $R_{WI}$ the direction of the IMU sensor in the world coordinate system, $\Delta t_{12}$ and $\Delta t_{23}$ the IMU time intervals from key frames 1, 2 to key frames 2, 3, G the gravity reference value, $\hat g_I$ the gravity vector in the IMU coordinate system together with its first-order approximation, $[\,]_{(:,1:2)}$ denoting that only the first two columns of the matrix are kept, $R_{WB}^{1}$ and $R_{WB}^{2}$ the rotations of the IMU coordinate system relative to the world coordinate system at key frames 1 and 2, $\Delta p_{12}$ and $\Delta p_{23}$ the IMU displacements from key frames 1, 2 to key frames 2, 3, $\Delta v_{23}$ the velocity change from key frame 2 to key frame 3, $J_{\Delta v_{23}}^{a}$, $J_{\Delta p_{23}}^{a}$, $J_{\Delta p_{12}}^{a}$ the first-order Jacobians of $\Delta v_{23}$, $\Delta p_{23}$, $\Delta p_{12}$ with respect to the accelerometer bias, $R_{WC}^{1}$, $R_{WC}^{2}$, $R_{WC}^{3}$ the rotations of the RGBD coordinate system C relative to the world coordinate system at key frames 1, 2, 3, $p_{WC}^{1}$, $p_{WC}^{2}$, $p_{WC}^{3}$ the coordinates of the RGBD coordinate-system center in the world coordinate system at key frames 1, 2, 3, and $p_{CB}$ the coordinates of the IMU coordinate-system center in the RGBD sensor coordinate system; S332, the gravity deviation vector $\delta\theta_{xy}$ and the accelerometer bias $b_a$ are solved by singular value decomposition; S333, through the equation:
- The positioning method based on an RGBD sensor and an IMU sensor according to claim 1, characterized in that step S4 proceeds as follows: S41, the visual SLAM algorithm estimates the pose information $\xi$ of the RGBD sensor; S42, constraint relations between key frames are built and, from the IMU sensor error term and the pose information of the mobile robot, a least-squares relation based on the reprojection error is established; $E_{proj}(k,j)$ is the reprojection error of the pose information and feature points: $E_{proj}(k,j)=e_{proj}^{T}\Sigma_k e_{proj}$, with $e_{proj}=u_k-\tfrac{1}{s}K X_k^{C}$ and $X_k^{C}=R_{CB}R_{BW}^{j}(X_k^{W}-p_{WB}^{j})+p_{CB}$; $E_{IMU}(i,j)$ is the error term of the IMU sensor: $e_R=\mathrm{Log}((\Delta R_{i,j}\mathrm{Exp}(J_{\Delta R}^{g}b_g^{j}))^{T}(R_{WB}^{i})^{T}R_{WB}^{j})$, $e_v=(R_{WB}^{i})^{T}(v^{j}-v^{i}-g_W\Delta t_{i,j})-(\Delta v_{i,j}+J_{\Delta v}^{g}b_g^{j}+J_{\Delta v}^{a}b_a^{j})$, $e_p=(R_{WB}^{i})^{T}(p^{j}-p^{i}-v^{i}\Delta t_{i,j}-\tfrac{1}{2}g_W\Delta t_{i,j}^{2})-(\Delta p_{i,j}+J_{\Delta p}^{g}b_g^{j}+J_{\Delta p}^{a}b_a^{j})$, $e_b=b_j-b_i$, $E_{IMU}(i,j)=[e_R^{T}\,e_v^{T}\,e_p^{T}]\Sigma_I[e_R^{T}\,e_v^{T}\,e_p^{T}]^{T}+e_b^{T}\Sigma_R e_b$, where k is the feature-point index of the image, i the key-frame index, j the current key-frame index, s the scale factor of the depth map, $u_k$ the two-dimensional camera coordinates of the k-th feature point of the RGBD sensor, K the RGBD intrinsic matrix, $\xi^{\wedge}$ the antisymmetric matrix of the RGBD pose information $\xi$, $X_k^{C}$ the coordinates of the k-th feature point $X_k$ in the RGBD coordinate system C, $\Sigma_k$ the scale information matrix of feature point $X_k$ and its pixel, $R_{CB}$ the calibrated rotation from the IMU sensor to the RGBD coordinate system, $R_{BW}^{j}$ the rotation of the world coordinate system relative to the IMU coordinate system at key frame j, $X_k^{W}$ the coordinates of $X_k$ in the world coordinate system W, $p_{WB}^{j}$ the coordinates of the IMU sensor in the world coordinate system at key frame j, $p_{CB}$ the coordinates of the IMU coordinate-system center in the RGBD coordinate system, $\Delta R_{i,j}$, $\Delta v_{i,j}$, $\Delta p_{i,j}$ the pre-integrations of the IMU rotation, velocity change and displacement between key frames i and j, $J_{\Delta R}^{g}$, $J_{\Delta v}^{g}$, $J_{\Delta p}^{g}$ the first-order Jacobians of $\Delta R$, $\Delta v$, $\Delta p$ with respect to the gyroscope bias, $J_{\Delta v}^{a}$, $J_{\Delta p}^{a}$ the first-order Jacobians of $\Delta v$, $\Delta p$ with respect to the accelerometer bias, $v^{j}$ the velocity of the IMU sensor at key frame j, $R_{WB}^{j}$ the rotation from the IMU coordinate system B to the world coordinate system at key frame j, $b_g^{j}$, $b_a^{j}$ the IMU gyroscope and accelerometer biases at key frame j, $\Sigma_R$, $\Sigma_I$ the information matrices of the IMU bias random walk and of the pre-integration, $b_i$, $b_j$ the IMU biases at key frames i and j, and $\Delta t_{i,j}$ the time difference from key frame i to key frame j; S43, for state tracking of the mobile robot at the current key frame j, a constraint is built only between j and the neighbouring key frame j-1: $\phi=\{R_{WB}^{j},p_{WB}^{j},v^{j},b_g^{j},b_a^{j}\}$, $\phi^{*}=\arg\min_{\phi}\big(\sum_k E_{proj}(k,j)+E_{IMU}(j{-}1,j)\big)$, where $\phi$ is the set of objects to be optimized, $\phi^{*}$ the set of $\phi$ objects after optimization, k the feature-point index, and j the current key-frame index; the optimal values of the robot pose and the IMU velocity and biases at the current key frame are solved as the state tracking of the mobile robot; S44, a sliding-window optimization nonlinearly optimizes the robot state: for the M key frames before the current one, constraints are built on the robot pose information, feature points and IMU error terms and jointly optimized, where $\phi$ is the set of objects to be optimized, $\phi^{*}$ the optimized set, k the feature-point index, i the key-frame index and j the current key-frame index; for key frames before the M+1 most recent key frames, constraints are built and optimized only on the pose information and feature points of key frames that share co-visible points with the current key frame, where $\psi$ is the set of objects to be optimized and $\psi^{*}$ the set of $\psi$ objects after optimization.
- The positioning method based on an RGBD sensor and an IMU sensor according to claim 1, characterized in that said step S5 proceeds as follows:
S51. Use a loop-closure detection algorithm based on the bag-of-words model to judge whether the image captured by the RGBD sensor at the current key frame closes a loop;
S52. If no loop is detected, output the pose of the mobile robot at the current key frame, end the loop-optimization step, and then continue tracking the state of the mobile robot; if a loop is detected, perform loop optimization of the RGBD sensor pose and the feature points with the visual SLAM algorithm;
S53. Optimize the pose of the mobile robot, the feature points and the IMU sensor error terms with the sliding-window nonlinear optimization algorithm, and output the pose of the mobile robot at the current key frame.
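Step S51's bag-of-words check can be pictured as quantizing each frame's descriptors against a visual vocabulary and comparing the resulting histograms. The Python toy below is a stand-in for a DBoW-style implementation: the `bow_vector`/`detect_loop` helpers, the random "vocabulary" and the 0.8 similarity threshold are all illustrative assumptions.

```python
import numpy as np

def bow_vector(descriptors, vocab):
    """Quantize each descriptor to its nearest vocabulary word and return an
    L2-normalized word-frequency histogram (toy stand-in for a DBoW vector)."""
    d2 = ((descriptors[:, None, :] - vocab[None, :, :]) ** 2).sum(axis=2)
    hist = np.bincount(d2.argmin(axis=1), minlength=len(vocab)).astype(float)
    return hist / np.linalg.norm(hist)

def detect_loop(query_vec, keyframe_vecs, threshold=0.8):
    """Return (loop_found, best_index) from cosine similarity of the query
    frame against stored key-frame vectors; the threshold is an assumption."""
    scores = np.array([query_vec @ v for v in keyframe_vecs])
    best = int(scores.argmax())
    return bool(scores[best] >= threshold), best

rng = np.random.default_rng(1)
vocab = rng.normal(size=(32, 8))      # toy visual vocabulary (32 words, 8-D)
frame_a = rng.normal(size=(60, 8))    # descriptors of an old key frame
frame_c = rng.normal(size=(60, 8))    # an unrelated key frame
frame_b = frame_a + 1e-6              # revisiting the same place: near-identical view
db = [bow_vector(frame_a, vocab), bow_vector(frame_c, vocab)]
found, idx = detect_loop(bow_vector(frame_b, vocab), db)
```

Revisiting the scene of `frame_a` scores near-perfect similarity against its stored vector, so the loop is flagged against database entry 0; a real system would then verify the candidate geometrically before triggering the pose-graph optimization of S52–S53.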
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2116617.8A GB2597632A (en) | 2020-03-11 | 2021-03-10 | RGBD sensor and IMU sensor-based positioning method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010166801.0 | 2020-03-11 | ||
CN202010166801.0A CN111462231B (zh) | 2020-03-11 | 2020-03-11 | 一种基于rgbd传感器和imu传感器的定位方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021180128A1 true WO2021180128A1 (zh) | 2021-09-16 |
Family
ID=71680707
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/080007 WO2021180128A1 (zh) | 2020-03-11 | 2021-03-10 | 一种基于rgbd传感器和imu传感器的定位方法 |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN111462231B (zh) |
GB (1) | GB2597632A (zh) |
WO (1) | WO2021180128A1 (zh) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114234967A (zh) * | 2021-12-16 | 2022-03-25 | 浙江大学 | 一种基于多传感器融合的六足机器人定位方法 |
CN114549656A (zh) * | 2022-02-14 | 2022-05-27 | 希姆通信息技术(上海)有限公司 | Ar眼镜相机与imu的标定方法 |
CN114913245A (zh) * | 2022-06-08 | 2022-08-16 | 上海鱼微阿科技有限公司 | 一种基于无向有权图的多标定块多摄像头标定方法及*** |
CN115855042A (zh) * | 2022-12-12 | 2023-03-28 | 北京自动化控制设备研究所 | 一种基于激光雷达协同辅助的行人视觉导航方法 |
CN116442248A (zh) * | 2023-06-19 | 2023-07-18 | 山东工程职业技术大学 | 一种基于目标检测的机器人视觉定位模块及方法 |
CN116883502A (zh) * | 2023-09-05 | 2023-10-13 | 深圳市智绘科技有限公司 | 相机位姿和路标点位置的确定方法、装置、介质及设备 |
CN117272593A (zh) * | 2023-08-24 | 2023-12-22 | 无锡北微传感科技有限公司 | 一种风洞试验数据分析处理方法 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111462231B (zh) * | 2020-03-11 | 2023-03-31 | 华南理工大学 | 一种基于rgbd传感器和imu传感器的定位方法 |
CN112085776B (zh) * | 2020-07-31 | 2022-07-19 | 山东科技大学 | 一种直接法无监督单目图像场景深度估计方法 |
CN112179373A (zh) * | 2020-08-21 | 2021-01-05 | 同济大学 | 一种视觉里程计的测量方法及视觉里程计 |
CN112461228B (zh) * | 2020-11-03 | 2023-05-09 | 南昌航空大学 | 一种相似环境下基于imu和视觉的二次回环检测定位方法 |
CN113159197A (zh) * | 2021-04-26 | 2021-07-23 | 北京华捷艾米科技有限公司 | 一种纯旋转运动状态判定方法及装置 |
CN113610001B (zh) * | 2021-08-09 | 2024-02-09 | 西安电子科技大学 | 基于深度相机和imu组合的室内移动终端定位方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108846867A (zh) * | 2018-08-29 | 2018-11-20 | 安徽云能天智能科技有限责任公司 | 一种基于多目全景惯导的slam*** |
CN109307508A (zh) * | 2018-08-29 | 2019-02-05 | 中国科学院合肥物质科学研究院 | 一种基于多关键帧的全景惯导slam方法 |
CN109993113A (zh) * | 2019-03-29 | 2019-07-09 | 东北大学 | 一种基于rgb-d和imu信息融合的位姿估计方法 |
US20190294903A1 (en) * | 2018-03-20 | 2019-09-26 | Vangogh Imaging, Inc. | 3d vision processing using an ip block |
CN110345944A (zh) * | 2019-05-27 | 2019-10-18 | 浙江工业大学 | 融合视觉特征和imu信息的机器人定位方法 |
CN111462231A (zh) * | 2020-03-11 | 2020-07-28 | 华南理工大学 | 一种基于rgbd传感器和imu传感器的定位方法 |
- 2020
  - 2020-03-11 CN CN202010166801.0A patent/CN111462231B/zh active Active
- 2021
  - 2021-03-10 GB GB2116617.8A patent/GB2597632A/en active Pending
  - 2021-03-10 WO PCT/CN2021/080007 patent/WO2021180128A1/zh active Application Filing
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114234967A (zh) * | 2021-12-16 | 2022-03-25 | 浙江大学 | 一种基于多传感器融合的六足机器人定位方法 |
CN114234967B (zh) * | 2021-12-16 | 2023-10-20 | 浙江大学 | 一种基于多传感器融合的六足机器人定位方法 |
CN114549656A (zh) * | 2022-02-14 | 2022-05-27 | 希姆通信息技术(上海)有限公司 | Ar眼镜相机与imu的标定方法 |
CN114913245A (zh) * | 2022-06-08 | 2022-08-16 | 上海鱼微阿科技有限公司 | 一种基于无向有权图的多标定块多摄像头标定方法及*** |
CN115855042A (zh) * | 2022-12-12 | 2023-03-28 | 北京自动化控制设备研究所 | 一种基于激光雷达协同辅助的行人视觉导航方法 |
CN116442248A (zh) * | 2023-06-19 | 2023-07-18 | 山东工程职业技术大学 | 一种基于目标检测的机器人视觉定位模块及方法 |
CN117272593A (zh) * | 2023-08-24 | 2023-12-22 | 无锡北微传感科技有限公司 | 一种风洞试验数据分析处理方法 |
CN117272593B (zh) * | 2023-08-24 | 2024-04-05 | 无锡北微传感科技有限公司 | 一种风洞试验数据分析处理方法 |
CN116883502A (zh) * | 2023-09-05 | 2023-10-13 | 深圳市智绘科技有限公司 | 相机位姿和路标点位置的确定方法、装置、介质及设备 |
CN116883502B (zh) * | 2023-09-05 | 2024-01-09 | 深圳市智绘科技有限公司 | 相机位姿和路标点位置的确定方法、装置、介质及设备 |
Also Published As
Publication number | Publication date |
---|---|
GB2597632A (en) | 2022-02-02 |
CN111462231A (zh) | 2020-07-28 |
CN111462231B (zh) | 2023-03-31 |
GB202116617D0 (en) | 2022-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021180128A1 (zh) | 一种基于rgbd传感器和imu传感器的定位方法 | |
CN111156998B (zh) | 一种基于rgb-d相机与imu信息融合的移动机器人定位方法 | |
CN109676604B (zh) | 机器人曲面运动定位方法及其运动定位*** | |
CN106052584B (zh) | 一种基于视觉及惯性信息融合的轨道空间线形测量方法 | |
CN108036785A (zh) | 一种基于直接法与惯导融合的飞行器位姿估计方法 | |
CN103994763B (zh) | 一种火星车的sins/cns深组合导航***及其实现方法 | |
CN108981693B (zh) | 基于单目相机的vio快速联合初始化方法 | |
Cheviron et al. | Robust nonlinear fusion of inertial and visual data for position, velocity and attitude estimation of UAV | |
CN109141433A (zh) | 一种机器人室内定位***及定位方法 | |
Kang et al. | Vins-vehicle: A tightly-coupled vehicle dynamics extension to visual-inertial state estimator | |
CN106289246A (zh) | 一种基于位置和姿态测量***的柔性杆臂测量方法 | |
CN106525049A (zh) | 一种基于计算机视觉的四足机器人本***姿跟踪方法 | |
CN113551665B (zh) | 一种用于运动载体的高动态运动状态感知***及感知方法 | |
CN114234967B (zh) | 一种基于多传感器融合的六足机器人定位方法 | |
CN115574816B (zh) | 仿生视觉多源信息智能感知无人平台 | |
CN109764870B (zh) | 基于变换估计量建模方案的载体初始航向估算方法 | |
CN107782309A (zh) | 非惯性系视觉和双陀螺仪多速率ckf融合姿态测量方法 | |
CN109282810A (zh) | 一种惯导和角度传感器融合的蛇形机器人姿态估计方法 | |
CN115371665A (zh) | 一种基于深度相机和惯性融合的移动机器人定位方法 | |
Karam et al. | Integrating a low-cost mems imu into a laser-based slam for indoor mobile mapping | |
CN108827287A (zh) | 一种复杂环境下的鲁棒视觉slam*** | |
Khosravian et al. | A discrete-time attitude observer on SO (3) for vision and GPS fusion | |
Luo et al. | End‐Effector Pose Estimation in Complex Environments Using Complementary Enhancement and Adaptive Fusion of Multisensor | |
Zhang et al. | Research on UAV attitude data fusion algorithm based on quaternion gradient descent | |
Yu et al. | Research on Robot Positioning Technology Based on Multi Sensor |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21768751; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 202116617; Country of ref document: GB; Kind code of ref document: A; Free format text: PCT FILING DATE = 20210310 |
NENP | Non-entry into the national phase | Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 14/02/2023) |
122 | Ep: pct application non-entry in european phase | Ref document number: 21768751; Country of ref document: EP; Kind code of ref document: A1 |