CN114413894A - Multi-sensor fusion robot positioning method - Google Patents

Multi-sensor fusion robot positioning method

Info

Publication number
CN114413894A
Authority
CN
China
Prior art keywords
positioning
robot
laser radar
point cloud
positioning result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202210169267.8A
Other languages
Chinese (zh)
Inventor
Gao Qinfa (高钦发)
Gao Ming (高明)
Wang Jianhua (王建华)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong New Generation Information Industry Technology Research Institute Co Ltd
Original Assignee
Shandong New Generation Information Industry Technology Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong New Generation Information Industry Technology Research Institute Co Ltd filed Critical Shandong New Generation Information Industry Technology Research Institute Co Ltd
Priority to CN202210169267.8A priority Critical patent/CN114413894A/en
Publication of CN114413894A publication Critical patent/CN114413894A/en
Withdrawn legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10: ... by using measurements of speed or acceleration
    • G01C21/12: ... executed aboard the object being navigated; Dead reckoning
    • G01C21/16: ... by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: ... combined with non-inertial navigation instruments
    • G01C21/1652: ... with ranging devices, e.g. LIDAR or RADAR
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39: ... the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42: Determining position
    • G01S19/48: Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/485: ... whereby the further system is an optical system or imaging system
    • G01S19/49: ... whereby the further system is an inertial position system, e.g. loosely-coupled

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a multi-sensor fusion robot positioning method. First, a point cloud map of the robot's operating area is built with a lidar SLAM algorithm. Dual GPS receivers then compute the current coordinates and heading, which serve as the initial pose for lidar point cloud registration, and accurate initial positioning is obtained by registering the lidar point cloud against the map. While the robot runs, lidar point cloud registration provides precise positioning, and in parallel the wheel odometer and IMU estimate the robot's motion. A positioning decision mechanism evaluates each lidar positioning result; if the result fails its checks, the wheel-odometer and IMU positioning result is adopted instead. The method positions the robot accurately, is stable and reliable, runs at low cost, suits indoor and outdoor positioning across many scenes, and operates stably both in heavily occluded urban areas and in open suburban environments.

Description

Multi-sensor fusion robot positioning method
Technical Field
The invention relates to a multi-sensor fusion robot positioning method in the field of robotics, which achieves accurate positioning of a robot across a variety of application scenarios.
Background
When a robot operates, positioning comes first: accurately determining the robot's current position is the basis for its stable operation. Relying solely on a single lidar sensor registered against a point cloud map yields high positioning accuracy, but the position is easily lost when the environment is complex and changeable, in particular when the map contains similar-looking scenes, when the lidar is occluded, or when the operating environment differs greatly from the mapping environment. Other approaches rely on RTK, but that requires building a dedicated base station or purchasing a correction service, which is costly. Multiple sensors therefore need to be fused into the positioning pipeline to improve both its stability and its accuracy.
Disclosure of Invention
The invention aims to provide a multi-sensor fusion robot positioning method that uses information from a multi-line lidar, a wheel odometer, an IMU (inertial measurement unit), and GPS to position a robot accurately; the method is stable, reliable, and low in cost, and suits indoor and outdoor positioning across many scenes.
To this end, the invention is realized by the following technical scheme:
step 1, establishing a spatial three-dimensional point cloud map. Establishing a point cloud map of a robot running area by using a laser radar SLAM algorithm;
and 2, initially positioning by using double GPS. Two GPS are arranged at the head and the tail of the robot at a certain interval, the head is a main GPS, and the tail is an auxiliary GPS. Calculating the current coordinate and orientation as an initial pose, wherein the initial pose has a certain deviation;
and 3, accurately positioning the laser radar. Taking the initial pose in the step two as an initial value of point cloud registration, and obtaining the accurate positioning of the robot by using a point cloud registration algorithm;
and 4, estimating the motion of the robot wheel type odometer and the IMU. In the small time difference between two frames of the laser radar, the robot moves approximately in a uniform motion and a uniform circular motion. Linear velocity is provided by a wheel odometer and angular velocity is provided by the IMU. And obtaining the estimation of the robot motion in the tiny time interval as a predicted value of positioning and an initial value of the next laser radar registration. (ii) a
And 5, precisely positioning by using the laser radar. Using the location predicted values of the odometer and the IMU as initial values of point cloud registration of the laser radar, and operating a point cloud registration program to obtain a location result of the laser radar;
and 6, repeating the steps 4-5 until all positioning results are obtained.
Preferably, the point cloud registration method in steps 3 and 5 is the NDT algorithm.
Preferably, if the lidar positioning result meets any of the following conditions, the lidar positioning is considered erroneous, the lidar result is discarded, and the odometer/IMU prediction is selected as the final positioning result:
(1) comparing the lidar positioning result with the previous positioning result, the robot's position change exceeds a set threshold, where the threshold is the robot's maximum linear velocity multiplied by the time interval between two lidar point cloud frames;
(2) comparing the lidar positioning result with the previous positioning result, the robot's heading change exceeds a set threshold, where the threshold is the robot's maximum angular velocity multiplied by the time interval between two lidar point cloud frames;
(3) the lidar is occluded and the number of collected point cloud points falls below a specific threshold, set here to 5;
(4) the point cloud registration score falls below a specified threshold, set here to 0.1.
The invention has the following advantages: the multi-sensor fusion positioning method uses information from the multi-line lidar, wheel odometer, IMU, and GPS mounted on the robot body to position the robot accurately; it is stable, reliable, and low in cost, and suits indoor and outdoor positioning across many scenes.
Drawings
FIG. 1 is a schematic flow diagram of the present invention.
Detailed Description
A multi-sensor fusion robot positioning method comprises the following steps:
Step 1: build a spatial three-dimensional point cloud map. A point cloud map of the robot's operating area is built with a lidar SLAM algorithm.
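By way of illustration, the sketch below (C++ with PCL) shows one way to assemble the step-1 map from scans whose poses a lidar SLAM front end has already estimated; the patent does not fix a particular SLAM implementation, so the function name, data layout, and voxel size here are assumptions.

```cpp
// Minimal sketch, not the patented SLAM pipeline: concatenate pose-tagged
// scans into a map frame, then voxel-downsample the result.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/common/transforms.h>
#include <pcl/filters/voxel_grid.h>
#include <Eigen/Geometry>
#include <vector>

using Cloud = pcl::PointCloud<pcl::PointXYZ>;

Cloud::Ptr buildMap(const std::vector<Cloud::Ptr>& scans,
                    const std::vector<Eigen::Isometry3f>& poses,
                    float voxel_size = 0.2f) {  // leaf size is an assumption
  Cloud::Ptr map(new Cloud);
  Cloud transformed;
  for (std::size_t i = 0; i < scans.size(); ++i) {
    // Move each scan into the map frame using its SLAM-estimated pose.
    pcl::transformPointCloud(*scans[i], transformed, poses[i].matrix());
    *map += transformed;
  }
  // Downsampling keeps later NDT registration against the map tractable.
  pcl::VoxelGrid<pcl::PointXYZ> grid;
  grid.setLeafSize(voxel_size, voxel_size, voxel_size);
  grid.setInputCloud(map);
  Cloud::Ptr downsampled(new Cloud);
  grid.filter(*downsampled);
  return downsampled;
}
```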
Step 2: initial positioning with dual GPS. Two GPS receivers are mounted at the head and tail of the robot at a fixed separation, the head unit being the main GPS and the tail unit the auxiliary GPS. The current coordinates and heading are computed as the initial pose; this initial pose carries some deviation.
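The initial pose can be computed as sketched below: the position comes from the main (head) GPS, and the heading is the direction from the auxiliary (tail) antenna to the main antenna, under a flat-earth approximation that is reasonable for a short antenna baseline. The struct and function names are illustrative, not taken from the patent.

```cpp
// Hedged sketch of the dual-GPS initial pose in a local east-north frame.
#include <cmath>

struct GpsFix { double lat_deg, lon_deg; };
struct Pose2D { double x, y, yaw; };  // meters, meters, radians

constexpr double kPi = 3.14159265358979323846;
constexpr double kEarthRadius = 6378137.0;  // WGS-84 semi-major axis [m]

Pose2D initialPose(const GpsFix& head, const GpsFix& tail,
                   const GpsFix& origin) {
  const double lat0 = origin.lat_deg * kPi / 180.0;
  auto toLocal = [&](const GpsFix& f, double& x, double& y) {
    // Equirectangular projection around the map origin (flat-earth assumption).
    x = (f.lon_deg - origin.lon_deg) * kPi / 180.0 * kEarthRadius * std::cos(lat0);
    y = (f.lat_deg - origin.lat_deg) * kPi / 180.0 * kEarthRadius;
  };
  double hx, hy, tx, ty;
  toLocal(head, hx, hy);
  toLocal(tail, tx, ty);
  // Position from the main (head) GPS; heading from the antenna baseline.
  return {hx, hy, std::atan2(hy - ty, hx - tx)};
}
```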
Step 3: precise lidar positioning. The initial pose from step 2 is used as the initial value of point cloud registration, and a point cloud registration algorithm yields an accurate robot position.
Step 4: motion estimation from the wheel odometer and IMU. Within the small time difference between two lidar frames, the robot's motion is approximated as uniform linear motion plus uniform circular motion; the linear velocity is provided by the wheel odometer and the angular velocity by the IMU. The resulting motion estimate over this short interval serves as the positioning prediction and as the initial value for the next lidar registration.
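The step-4 prediction matches the standard constant-velocity, constant-turn-rate model, integrated in closed form over the inter-frame interval; the patent does not spell out the integration scheme, so the sketch below is an assumption consistent with the stated uniform-motion hypothesis.

```cpp
// Predict the pose after dt seconds of motion with constant linear speed v
// (from the wheel odometer) and constant yaw rate w (from the IMU).
#include <cmath>

struct Pose2D { double x, y, yaw; };

Pose2D predict(const Pose2D& p, double v, double w, double dt) {
  Pose2D q = p;
  if (std::abs(w) < 1e-6) {  // nearly straight: uniform linear motion
    q.x += v * dt * std::cos(p.yaw);
    q.y += v * dt * std::sin(p.yaw);
  } else {                   // arc of a uniform circular motion
    q.x += (v / w) * (std::sin(p.yaw + w * dt) - std::sin(p.yaw));
    q.y += (v / w) * (std::cos(p.yaw) - std::cos(p.yaw + w * dt));
  }
  q.yaw = p.yaw + w * dt;
  return q;
}
```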
Step 5: precise lidar positioning. The odometer/IMU prediction is used as the initial value of lidar point cloud registration, and the registration routine is run to obtain the lidar positioning result.
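Because the preferred registration method is NDT, steps 3 and 5 can be sketched with PCL's NDT implementation as below; the parameter values are common PCL defaults rather than values stated in the patent.

```cpp
// Sketch of lidar positioning by NDT registration against the prebuilt map.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/registration/ndt.h>
#include <Eigen/Core>

using Cloud = pcl::PointCloud<pcl::PointXYZ>;

// Returns the refined pose; `fitness` receives the registration score that
// the positioning decision mechanism inspects afterwards.
Eigen::Matrix4f registerScan(const Cloud::Ptr& scan, const Cloud::Ptr& map,
                             const Eigen::Matrix4f& init_guess,
                             double& fitness) {
  pcl::NormalDistributionsTransform<pcl::PointXYZ, pcl::PointXYZ> ndt;
  ndt.setResolution(1.0f);             // NDT voxel size [m]
  ndt.setStepSize(0.1);                // More-Thuente line-search step
  ndt.setTransformationEpsilon(0.01);  // convergence threshold
  ndt.setMaximumIterations(35);
  ndt.setInputSource(scan);            // current lidar frame
  ndt.setInputTarget(map);             // prebuilt point cloud map
  Cloud aligned;
  ndt.align(aligned, init_guess);      // seeded with the motion prediction
  fitness = ndt.getFitnessScore();
  return ndt.getFinalTransformation();
}
```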
the lidar positioning decision mechanism is as follows:
evaluating the positioning result of the laser radar, if any one of the following conditions is met, determining that the laser radar is positioned wrongly,
(1) comparing the lidar positioning result with the previous positioning result, the robot's position change exceeds a set threshold, where the threshold is the robot's maximum linear velocity multiplied by the time interval between two lidar point cloud frames;
(2) comparing the lidar positioning result with the previous positioning result, the robot's heading change exceeds a set threshold, where the threshold is the robot's maximum angular velocity multiplied by the time interval between two lidar point cloud frames;
(3) the lidar is occluded and the number of collected point cloud points falls below a specific threshold, set here to 5;
(4) the point cloud registration score falls below a specified threshold, set here to 0.1.
If any condition holds, the lidar positioning result is discarded and the odometer/IMU prediction is selected as the final positioning result.
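A minimal sketch of the four checks is given below, with the thresholds wired as the patent states them (maximum speed multiplied by the lidar inter-frame interval; the constants 5 and 0.1 are the patent's stated values). Note that whether a low or a high registration score signals a poor match depends on the registration implementation; the comparison here follows the patent text literally.

```cpp
// Decision mechanism: accept the lidar result only if none of the four
// rejection conditions holds; otherwise fall back to the step-4 prediction.
#include <cmath>
#include <cstddef>

struct Pose2D { double x, y, yaw; };

bool acceptLidarResult(const Pose2D& lidar, const Pose2D& last, double dt,
                       double max_v, double max_w,
                       std::size_t num_points, double registration_score) {
  const double dx = lidar.x - last.x;
  const double dy = lidar.y - last.y;
  // Wrap the heading change into (-pi, pi] before comparing.
  const double dyaw = std::atan2(std::sin(lidar.yaw - last.yaw),
                                 std::cos(lidar.yaw - last.yaw));
  if (std::hypot(dx, dy) > max_v * dt) return false;  // condition (1)
  if (std::abs(dyaw) > max_w * dt) return false;      // condition (2)
  if (num_points < 5) return false;                   // condition (3): occluded
  if (registration_score < 0.1) return false;         // condition (4)
  return true;
}
```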
Step 6: repeat steps 4-5 until all positioning results are obtained.

Claims (3)

1. A multi-sensor fusion robot positioning method, comprising the following steps:
step 1, building a spatial three-dimensional point cloud map: a point cloud map of the robot's operating area is built with a lidar SLAM algorithm;
step 2, initial positioning with dual GPS: two GPS receivers are placed at the head and tail of the robot with a separation greater than 0.8 m, the head unit being the main GPS and the tail unit the auxiliary GPS, and the current coordinates and heading are computed as the initial pose;
step 3, precise lidar positioning: the initial pose from step 2 is used as the initial value of point cloud registration, and a point cloud registration algorithm yields an accurate robot position;
step 4, motion estimation from the wheel odometer and IMU: within the time difference between two lidar frames the robot's motion is approximated as uniform linear motion plus uniform circular motion, with the linear velocity provided by the wheel odometer and the angular velocity by the IMU; the resulting motion estimate over this short interval serves as the positioning prediction and as the initial value for the next lidar registration;
step 5, precise lidar positioning: the odometer/IMU prediction is used as the initial value of lidar point cloud registration, and the registration routine is run to obtain the lidar positioning result;
step 6, repeating steps 4-5 until all positioning results are obtained.
2. The multi-sensor fusion robot positioning method according to claim 1, wherein the point cloud registration algorithm in steps 3 and 5 is the NDT registration algorithm.
3. The multi-sensor fusion robot positioning method according to claim 1, wherein the lidar positioning result is considered erroneous if any of the following conditions is met, in which case the lidar positioning result is discarded and the odometer/IMU prediction is selected as the final positioning result:
(1) comparing the lidar positioning result with the previous positioning result, the robot's position change exceeds a set threshold, where the threshold is the robot's maximum linear velocity multiplied by the time interval between two lidar point cloud frames;
(2) comparing the lidar positioning result with the previous positioning result, the robot's heading change exceeds a set threshold, where the threshold is the robot's maximum angular velocity multiplied by the time interval between two lidar point cloud frames;
(3) the lidar is occluded and the number of collected point cloud points falls below a specific threshold, set here to 5;
(4) the point cloud registration score falls below a specified threshold, set here to 0.1.
CN202210169267.8A 2022-02-24 2022-02-24 Multi-sensor fusion robot positioning method Withdrawn CN114413894A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210169267.8A CN114413894A (en) 2022-02-24 2022-02-24 Multi-sensor fusion robot positioning method

Publications (1)

Publication Number Publication Date
CN114413894A (en) 2022-04-29

Family

ID=81261988

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210169267.8A Withdrawn CN114413894A (en) 2022-02-24 2022-02-24 Multi-sensor fusion robot positioning method

Country Status (1)

Country Link
CN (1) CN114413894A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117346768A (en) * 2023-11-03 2024-01-05 昆明理工大学 Multi-sensor fusion sensing positioning method suitable for indoor and outdoor
CN117346768B (en) * 2023-11-03 2024-04-19 昆明理工大学 Multi-sensor fusion sensing positioning method suitable for indoor and outdoor

Similar Documents

Publication Publication Date Title
US10845457B2 (en) Drone localization
CN108873908B (en) Robot city navigation system based on combination of visual SLAM and network map
EP3136128B1 (en) Trajectory matching using peripheral signal
CN110361027A Robot path planning method based on single-line laser radar and binocular camera data fusion
Schreiber et al. Laneloc: Lane marking based localization using highly accurate maps
JP2022106924A (en) Device and method for autonomous self-position estimation
CN103674015B (en) Trackless positioning navigation method and device
CN110243358A (en) The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion
US11119192B2 (en) Automatic detection of overhead obstructions
CN109282808B (en) Unmanned aerial vehicle and multi-sensor fusion positioning method for bridge three-dimensional cruise detection
Chi et al. Automatic guidance of underground mining vehicles using laser sensors
CN103207634A (en) Data fusion system and method of differential GPS (Global Position System) and inertial navigation in intelligent vehicle
CN112882053B (en) Method for actively calibrating external parameters of laser radar and encoder
US20200240790A1 (en) Localization with Neural Network Based Image Registration of Sensor Data and Map Data
CN103487050A (en) Positioning method for indoor mobile robot
CN111025366B (en) Grid SLAM navigation system and method based on INS and GNSS
Wen et al. Multi-agent collaborative GNSS/camera/INS integration aided by inter-ranging for vehicular navigation in urban areas
CN110412596A (en) A kind of robot localization method based on image information and laser point cloud
CN114013427B (en) Parking data processing method for automatic parking test
CN115857504A (en) DWA-based robot local path planning method, equipment and storage medium in narrow environment
CN113189613A (en) Robot positioning method based on particle filtering
CN114413894A (en) Multi-sensor fusion robot positioning method
CN115728803A (en) System and method for continuously positioning urban driving vehicle
CN106843206A (en) Assisted location method based on existing road network
Lee et al. Adaptive localization for mobile robots in urban environments using low-cost sensors and enhanced topological map

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220429