CN110488818A - Lidar-based robot localization method, apparatus, and robot - Google Patents

Lidar-based robot localization method, apparatus, and robot

Info

Publication number
CN110488818A
CN110488818A CN201910731221.9A CN201910731221A
Authority
CN
China
Prior art keywords
robot
positioning
map
variances
mean value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910731221.9A
Other languages
Chinese (zh)
Other versions
CN110488818B (en)
Inventor
罗丹平
叶力荣
张国栋
闫瑞君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Silver Star Intelligent Group Co Ltd
Original Assignee
Shenzhen Silver Star Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Silver Star Intelligent Technology Co Ltd filed Critical Shenzhen Silver Star Intelligent Technology Co Ltd
Priority to CN201910731221.9A priority Critical patent/CN110488818B/en
Publication of CN110488818A publication Critical patent/CN110488818A/en
Application granted granted Critical
Publication of CN110488818B publication Critical patent/CN110488818B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Embodiments of the present invention relate to a lidar-based robot localization method, apparatus, and robot. The method comprises: obtaining an initial estimated position of the robot; obtaining a local particle filter model according to the initial estimated position and a map; obtaining obstacle information of the robot through the lidar; and localizing the robot using the local particle filter model and the obstacle information, to obtain the position of the robot in the map. By obtaining the initial estimated position of the robot, building a local particle filter model from that position and the map, and localizing the robot with the local particle filter model, the method achieves fast localization, because the search range of a local particle filter model is small relative to the search range of global localization.

Description

Lidar-based robot localization method, apparatus, and robot
Technical field
Embodiments of the present invention relate to the field of artificial intelligence, and in particular to a lidar-based robot localization method, apparatus, and robot.
Background art
Robots are widely appreciated because they can take over heavy housework from humans. When completing user tasks, a robot needs to move in an unknown environment. To achieve autonomous localization and navigation during motion, it must build an incremental map while estimating its own position in that map.
In the process of realizing the present invention, the inventors found at least the following problems in the related art: when a robot is manually moved from one pose to another, its position and orientation change substantially relative to the state before the move. If map construction simply continues at this point, the pose error causes inaccurate mapping and map overlap, so the robot must be relocalized. Currently, a global localization method is used directly to relocalize the robot, but because its search range is wide, global localization is slow.
Summary of the invention
The purpose of embodiments of the present invention is to provide a lidar-based robot localization method, apparatus, and robot with fast localization speed.
In a first aspect, an embodiment of the present invention provides a lidar-based robot localization method for a robot that includes a lidar, the method comprising:
obtaining an initial estimated position of the robot;
obtaining a local particle filter model according to the initial estimated position and a map;
obtaining obstacle information of the robot through the lidar; and
localizing the robot using the local particle filter model and the obstacle information, to obtain the position of the robot in the map.
In some embodiments, the local particle filter model includes a first local particle filter model and a second local particle filter model;
the localizing the robot using the local particle filter model and the obstacle information to obtain the position of the robot in the map comprises:
localizing the robot using the first local particle filter model and the obstacle information to obtain a first localization result;
localizing the robot using the second local particle filter model and the obstacle information to obtain a second localization result; and
fusing the first localization result and the second localization result to obtain the position of the robot in the map.
In some embodiments, the first localization result includes a first localization mean and a first localization variance, and the second localization result includes a second localization mean and a second localization variance;
the fusing the first localization result and the second localization result to obtain the position of the robot in the map comprises:
if the first localization variance and the second localization variance both become less than or equal to a preset variance threshold within a preset time, and the difference between the first localization mean and the second localization mean becomes less than or equal to a preset difference threshold within the preset time, then
obtaining the position of the robot in the map according to the first localization mean, the first localization variance, the second localization mean, and the second localization variance.
In some embodiments, the fusing the first localization result and the second localization result to obtain the position of the robot in the map further comprises:
if the first localization variance or the second localization variance fails to become less than or equal to the preset variance threshold within the preset time, or if the first localization variance and the second localization variance become less than or equal to the preset variance threshold within the preset time but the difference between the first localization mean and the second localization mean fails to become less than or equal to the preset difference threshold within the preset time, then
obtaining a third, global particle filter model according to the map, and localizing the robot using the third global particle filter model to obtain the position of the robot in the map.
In some embodiments, the obtaining the position of the robot in the map according to the first localization mean, the first localization variance, the second localization mean, and the second localization variance comprises:
obtaining a first localization weight and a second localization weight from the first localization variance and the second localization variance, respectively; and
computing a weighted average of the first localization mean and the second localization mean according to the first localization weight and the second localization weight, to obtain the position of the robot in the map.
In some embodiments, the obtaining the initial estimated position of the robot comprises:
obtaining an original position of the robot and an original coordinate system, the original position being the position of the robot before it was manually moved;
dividing the time during which the robot is manually moved into at least two sampling time intervals;
in each sampling time interval, establishing a local coordinate system according to the pose of the robot, obtaining the displacement increment of the robot in the local coordinate system, and converting that displacement increment into a displacement increment in the original coordinate system; and
obtaining the initial estimated position of the robot according to the original position and the displacement increment of the robot in the original coordinate system in each sampling time interval.
In some embodiments, the obtaining the initial estimated position of the robot further comprises:
obtaining the angular velocity and acceleration of the robot;
integrating the angular velocity over the sampling time interval to obtain the pose of the robot;
integrating the acceleration over the sampling time interval to obtain the velocity of the robot; and
obtaining the displacement increment of the robot in the local coordinate system during the sampling time interval according to the velocity and acceleration of the robot.
In some embodiments, the localizing the robot using the local particle filter model and the obstacle information comprises:
updating the weight of each particle in the local particle filter model according to the obstacle information; and
obtaining a localization mean and a localization variance according to the weights and the positions of the particles in the local particle filter model.
In a second aspect, an embodiment of the present invention provides a lidar-based robot localization apparatus for a robot that includes a lidar, the apparatus comprising:
an initial estimated position obtaining module, configured to obtain an initial estimated position of the robot;
a local particle filter model obtaining module, configured to obtain a local particle filter model according to the initial estimated position and a map;
an obstacle information obtaining module, configured to obtain obstacle information of the robot through the lidar; and
a localization module, configured to localize the robot using the local particle filter model and the obstacle information, to obtain the position of the robot in the map.
In some embodiments, the local particle filter model includes a first local particle filter model and a second local particle filter model;
the localization module is configured to:
localize the robot using the first local particle filter model and the obstacle information to obtain a first localization result;
localize the robot using the second local particle filter model and the obstacle information to obtain a second localization result; and
fuse the first localization result and the second localization result to obtain the position of the robot in the map.
In a third aspect, an embodiment of the present invention provides a robot, comprising:
a robot body;
a walking mechanism on the robot body;
a lidar and an inertial measurement unit on the robot body, the lidar being configured to obtain obstacle information of the robot; and
a controller built into the robot body and connected with the lidar and the inertial measurement unit;
wherein the controller includes:
at least one processor, and
a memory communicatively connected with the at least one processor, the memory storing instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to perform the method described above.
In a fourth aspect, an embodiment of the present invention provides a non-volatile computer-readable storage medium storing computer-executable instructions which, when executed by a robot, cause the robot to perform the method described above.
With the lidar-based robot localization method, apparatus, and robot of the embodiments of the present invention, an initial estimated position of the robot is obtained, a local particle filter model is obtained using the initial estimated position and a map, and the robot is localized using the local particle filter model. Because the search range of the local particle filter model is small relative to the search range of global localization, localization is fast.
Brief description of the drawings
One or more embodiments are illustrated by the figures in the corresponding drawings. These exemplary illustrations do not limit the embodiments; elements with the same reference numerals in the drawings represent similar elements. Unless otherwise stated, the figures in the drawings are not drawn to scale.
Fig. 1 is a schematic diagram of an application scenario of the robot localization method and apparatus of an embodiment of the present invention;
Fig. 2 is a structural schematic diagram of an embodiment of the robot of the present invention;
Fig. 3 is a flow diagram of an embodiment of the robot localization method of the present invention;
Fig. 4 is a schematic diagram of the original coordinate system and a local coordinate system in an embodiment of the robot localization method of the present invention;
Fig. 5 is a schematic diagram of updating particle weights in an embodiment of the robot localization method of the present invention;
Fig. 6 is a flow diagram of another embodiment of the robot localization method of the present invention;
Fig. 7 is a structural schematic diagram of an embodiment of the robot localization apparatus of the present invention;
Fig. 8 is a hardware structural diagram of the controller in an embodiment of the robot of the present invention.
Detailed description of embodiments
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
The lidar-based robot localization method and apparatus provided by embodiments of the present invention can be applied to the application scenario shown in Fig. 1. The application scenario includes a robot 10. The robot 10 can be a mobile robot, such as a sweeping robot, an inspection robot, an unmanned sampling robot, or an unmanned forklift.
To complete user tasks, or for other reasons, the robot 10 needs to move in an unknown environment. To achieve autonomous localization and navigation during motion, it needs to build an incremental map while localizing itself, i.e., estimating its own position in the map.
When the robot is manually moved from one pose to another, its position and orientation both change substantially relative to the state before the move. If map construction simply continues at this point, the pose error causes inaccurate mapping and map overlap, so the robot must be relocalized.
If global localization is performed directly in the map, the wide search range makes localization slow. In embodiments of the present invention, an initial estimation is first performed to obtain an initial estimated position of the robot, a local particle filter model is then obtained in the map according to the initial estimated position, and the robot is localized using the local particle filter model. The local particle filter model includes at least two particles, located within a certain area near the initial estimated position. Global localization, by contrast, localizes the robot using a global particle filter model whose particles are distributed over the whole map according to some distribution rule. Because the search range of the local particle filter model is small relative to the search range of global localization, localization is fast.
To make the autonomous localization of the robot more accurate, the robot must localize itself with reference to its external environment; for this, the robot can measure the environment with external sensors (for example, ultrasonic, laser, or visual sensors). In embodiments of the present invention, the robot detects obstacle data of the surrounding environment using its onboard lidar, and localizes itself using the local particle filter model and the obstacle data.
In some embodiments, referring to Fig. 2, the robot 10 includes a robot body 11, a lidar 12, a controller 13, a walking mechanism 14, and an inertial measurement unit 15. The robot body 11 is the main structure of the robot; its shape, structure, and material (such as rigid plastic, or metals like aluminum and iron) can be chosen according to the actual needs of the robot 10, for example the relatively flat cylinder common for sweeping robots.
The walking mechanism 14 is arranged on the robot body 11 and is the structural device that provides the robot 10 with locomotion. The walking mechanism 14 can be implemented with any kind of moving device, such as rollers or tracks. The lidar 12 is used to perceive the obstacles around the robot and acquire obstacle information. In other embodiments, other sensors, such as an RGB-D camera, can be used instead of the lidar.
In some of these embodiments, the inertial measurement unit 15 includes a gyroscope, an accelerometer, etc., and is used to obtain the angular velocity and acceleration of the robot and to estimate the position of the robot from them, obtaining the robot's initial estimated position. In other embodiments, the initial estimated position of the robot can also be obtained using other methods or other measuring units.
The controller 13 is the electronic computing core built into the robot body 11; it executes logical operations to realize the intelligent control of the robot 10. The controller 13 is connected with the lidar 12 and the inertial measurement unit 15, and executes preset algorithms on the data collected by the lidar 12 and the inertial measurement unit 15 to perform map construction and robot localization.
It should be noted that, depending on the task to be completed, the robot 10 can also carry one or more other functional modules (such as a water tank or a cleaning device) in addition to the functional modules above, which cooperate to perform the corresponding task.
Fig. 3 is a flow diagram of the lidar-based robot localization method provided by an embodiment of the present invention. The method can be executed by the robot 10 shown in Fig. 1 or Fig. 2 (specifically, in some of these embodiments, by the controller 13 of the robot 10). As shown in Fig. 3, the method comprises:
101: Obtain the initial estimated position of the robot.
The initial estimated position is an estimate of the position of the robot after it has been manually moved. In some of these embodiments, an inertial measurement unit (IMU) can be used to obtain the angular velocity and acceleration of the robot, from which the displacement of the robot after the manual move relative to before the move is obtained, so that the initial estimated position of the robot is obtained from the original position (i.e., the position before the manual move).
Since the pose of the robot changes continuously while it travels from the original position to the position after the manual move, for ease of calculation the time during which the robot is manually moved (i.e., from before to after the manual move) is divided into multiple sampling time intervals Δt. Because Δt is sufficiently small, the acceleration, velocity, angular velocity, and heading of the robot are assumed constant within each Δt. Computing the displacement increment of the robot within each Δt yields the displacement of the robot after the manual move relative to the original position, and thus the initial estimated position of the robot. Detecting that the robot has been picked up and put down can be realized by a touch switch or a ground-detection module arranged on the robot.
The process of obtaining the displacement increment of the robot within Δt is illustrated below with the sampling time interval [t, t+Δt] as an example. Referring to Fig. 4, P(0) denotes the original position, the original coordinate system at the original position is the X-Y-Z coordinate system, P(t) denotes the position of the robot at time t, and the local coordinate system at time t is the X(t)-Y(t)-Z(t) coordinate system. The angular velocity between t−Δt and t can be obtained from the IMU; integrating this angular velocity over [t−Δt, t] yields the angle change of the robot at time t relative to time t−Δt. Iterating over the angle changes in each Δt before time t yields the angle at time t relative to the original position. For example, in the RPY representation this angle can be expressed as (r_t, p_t, y_t), denoting the rotation angles of the robot about the X, Y, and Z axes at time t. The coordinate system X(t)-Y(t)-Z(t) at time t can be established from this angle.
The acceleration (imuax_t, imuay_t, imuaz_t) at time t and the acceleration at time t−Δt are obtained from the IMU; integrating the acceleration at time t−Δt over [t−Δt, t] yields the velocity (vx_t, vy_t, vz_t) at time t. The projections of gravitational acceleration onto the three axes of the X(t)-Y(t)-Z(t) coordinate system are (gx_t, gy_t, gz_t), so the effective acceleration at time t is (ax_t, ay_t, az_t) = (imuax_t, imuay_t, imuaz_t) − (gx_t, gy_t, gz_t). Substituting the velocity (vx_t, vy_t, vz_t) and the effective acceleration (ax_t, ay_t, az_t) into the kinematic formula Δs = v·Δt + (1/2)·a·Δt², the displacement increment within the interval Δt can be calculated. This displacement increment is expressed in the local coordinate system X(t)-Y(t)-Z(t); by a spatial coordinate transformation it can be converted into a displacement increment in the original coordinate system X-Y-Z. Adding the coordinates of the original position and the displacement increments of the robot in the X-Y-Z coordinate system in each sampling time interval yields the initial estimated position of the robot.
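As a rough sketch of the dead-reckoning integration described above (not the patent's implementation — the IMU sample layout, gravity handling, and Z-Y-X rotation convention are assumptions made here for illustration):

```python
import numpy as np

def rotation_from_rpy(r, p, y):
    """Rotation matrix (local frame -> original frame) from roll-pitch-yaw."""
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cy, sy = np.cos(y), np.sin(y)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def dead_reckon(origin, samples, dt):
    """Estimate the position after a manual move by integrating IMU samples.

    origin:  (x, y, z) original position in the X-Y-Z frame.
    samples: list of (gyro_xyz, accel_xyz) local-frame measurements,
             one pair per sampling interval dt.
    Returns the estimated position in the original frame.
    """
    g = np.array([0.0, 0.0, 9.81])           # gravity to subtract (world frame)
    pos = np.asarray(origin, dtype=float)
    vel = np.zeros(3)
    rpy = np.zeros(3)
    for gyro, accel in samples:
        rpy = rpy + np.asarray(gyro, dtype=float) * dt   # integrate angular rate
        R = rotation_from_rpy(*rpy)                      # local -> original frame
        a = R @ np.asarray(accel, dtype=float) - g       # effective acceleration
        pos = pos + vel * dt + 0.5 * a * dt**2           # ds = v*dt + a*dt^2/2
        vel = vel + a * dt                               # integrate acceleration
    return pos
```

Pure double-integration of this kind drifts quickly, which is consistent with the patent treating the result only as a coarse initial estimate to seed the local particle filter.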
102: Obtain a local particle filter model according to the initial estimated position and the map.
Specifically, in some embodiments, obtaining the local particle filter model comprises obtaining, according to the initial estimated position, a region in the map — which can be a certain area near the initial estimated position — and then randomly sampling multiple particles within this region according to a distribution rule (such as a uniform distribution or a Gaussian distribution). Each particle has a position in the map.
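A minimal sketch of this local initialization step, using a Gaussian spread as one of the distribution rules mentioned; the particle count and spread parameters are assumptions, not values from the patent:

```python
import numpy as np

def init_local_particles(est_pose, n=200, sigma_xy=0.3, sigma_theta=0.2, rng=None):
    """Sample particles (x, y, theta) around an initial pose estimate.

    est_pose: (x, y, theta) initial estimated position and heading.
    Returns an (n, 3) array of particle poses and uniform weights (n,).
    """
    if rng is None:
        rng = np.random.default_rng()
    x, y, theta = est_pose
    particles = np.column_stack([
        rng.normal(x, sigma_xy, n),        # spread around the estimate,
        rng.normal(y, sigma_xy, n),        # not over the whole map
        rng.normal(theta, sigma_theta, n),
    ])
    weights = np.full(n, 1.0 / n)          # start with equal weights
    return particles, weights
```

The small sampling region is exactly what makes the local model faster than a global one, whose particles would have to cover the entire map.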
103: Obtain the obstacle information of the robot through the lidar.
The lidar detects the obstacles around the robot and obtains obstacle information about the surrounding environment, such as the distance and direction of each obstacle relative to the robot.
104: Localize the robot using the local particle filter model and the obstacle information, to obtain the position of the robot in the map.
Each particle in the particle filter model represents one possible position of the robot. Suppose the robot moves 0.1 meters and rotates 0.7 radians; the state of each particle can then be predicted with the system motion model, giving a predicted position for each particle.
The weight of each particle can be updated according to the obstacle information measured by the lidar, which guarantees that particles closer to the true position obtain larger weights. Specifically, if the map contains the same or similar obstacle data at a particle's pose, the particle's weight is high; otherwise, its weight is low. For example, if the lidar measures obstacles at certain directions and distances from the robot, then a particle whose surroundings in the map likewise contain obstacles at the same or similar directions and distances receives a high weight.
Referring to Fig. 5, the grid represents the occupancy grid map, the triangles represent particle poses, and the circles represent lidar scan points; white cells are free grid cells and grey cells are occupied grid cells. At the pose of each particle, the laser scan data matches the occupied cells of the map to a different degree; computing this matching degree yields the weight of the particle. As can be seen from the figure, the middle particle matches relatively well, so its weight is higher.
After updating the weight of each particle, resampling can be performed, and then a weighted average over the weights and predicted positions of the resampled particles yields the localization mean and variance.
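The update–resample–estimate loop described above can be sketched as follows; `scan_match_score` stands in for the map-matching degree of Fig. 5 and is an assumed callable supplied by the caller, not part of the patent:

```python
import numpy as np

def pf_step(particles, weights, scan_match_score, rng=None):
    """One lidar measurement update of the local particle filter.

    particles: (n, 3) predicted poses; weights: (n,) prior weights.
    scan_match_score(pose) -> float: how well the lidar scan, placed at
        `pose`, matches the occupied cells of the grid map.
    Returns (resampled particles, localization mean, per-axis variance).
    """
    if rng is None:
        rng = np.random.default_rng()
    scores = np.array([scan_match_score(p) for p in particles])
    weights = weights * scores               # particles that match get heavier
    weights = weights / weights.sum()        # normalize to a distribution
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    resampled = particles[idx]               # low-weight particles die out
    mean = resampled.mean(axis=0)            # localization mean
    var = resampled.var(axis=0)              # localization variance
    return resampled, mean, var
```

A small variance indicates that the particles have converged, which is the success criterion the fusion step below checks against a threshold.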
In some embodiments, a single local particle filter model can be used to localize the robot. In other embodiments, to reduce false positives, the robot can also be localized using at least two particle filter models, and their localization results are fused to obtain the position of the robot in the map.
The following takes a local particle filter model that includes a first local particle filter model and a second local particle filter model as an example, to illustrate how the localization results of at least two local particle filter models are fused.
The robot is localized with the first local particle filter model, yielding a first localization mean and a first localization variance, and with the second local particle filter model, yielding a second localization mean and a second localization variance. If the first and second localization variances both become less than or equal to a preset variance threshold within a preset time (indicating that both local particle filter models have localized successfully), and the difference between the first and second localization means becomes less than or equal to a preset difference threshold within the preset time, then the position of the robot in the map is obtained from the first localization mean, first localization variance, second localization mean, and second localization variance.
Only when both local particle filter models localize successfully, and their results are close to each other, is the localization counted as successful and the results of the two local particle filter models fused to obtain the located position of the robot. This reduces false positives and localizes the robot better.
Specifically, a first localization weight and a second localization weight can be obtained from the first localization variance and the second localization variance, respectively; a weighted average of the first localization mean and the second localization mean is then computed according to the first localization weight and the second localization weight, obtaining the position of the robot in the map.
In some of these embodiments, the reciprocal of the variance can be used as the weight. Let the localization mean and variance of the first local particle filter model be m1 = [x1, y1, theta1] and c1 = [cx1, cy1, ctheta1], and those of the second local particle filter model be m2 = [x2, y2, theta2] and c2 = [cx2, cy2, ctheta2].
The fused result is then m = [x, y, theta], where each component is the inverse-variance weighted average, e.g. x = (x1/cx1 + x2/cx2) / (1/cx1 + 1/cx2), with y and theta computed analogously from (cy1, cy2) and (ctheta1, ctheta2).
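The patent's fusion formulas appear as images in the original and are not reproduced in this text; under the stated rule that the weight is the reciprocal of the variance, a per-component inverse-variance fusion can be sketched as:

```python
import numpy as np

def fuse(m1, c1, m2, c2):
    """Fuse two localization results by inverse-variance weighting.

    m1, m2: [x, y, theta] localization means.
    c1, c2: per-component localization variances (same shape).
    Lower-variance (more confident) results contribute more, because
    the weight of each result is the reciprocal of its variance.
    """
    m1, c1, m2, c2 = map(np.asarray, (m1, c1, m2, c2))
    w1, w2 = 1.0 / c1, 1.0 / c2
    return (w1 * m1 + w2 * m2) / (w1 + w2)
```

Note that averaging theta component-wise like this implicitly assumes the two heading estimates are close (which the preset difference threshold guarantees before fusion is attempted); otherwise angle wrap-around would need explicit handling.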
In other embodiments, to guarantee the robustness of the algorithm, the robot performs global localization if local localization is unsuccessful. That is, if the first local particle filter model or the second local particle filter model fails to localize (i.e., the first localization variance or the second localization variance fails to become less than or equal to the preset variance threshold within the preset time; or the first and second localization variances become less than or equal to the preset variance threshold within the preset time, but the difference between the first and second localization means fails to become less than or equal to the preset difference threshold within the preset time), then a third, global particle filter model is obtained according to the map, and the robot is globally localized using the third global particle filter model, obtaining the position of the robot in the map.
With the laser-radar-based robot localization method, apparatus, and robot of the embodiments of the present invention, the initial estimated position of the robot is obtained, a local particle filter model is obtained using the initial estimated position and a map, and the robot is positioned using the local particle filter model. Since the search range of the local particle filter model is small relative to the global positioning search range, positioning is fast.
Fig. 6 illustrates a specific implementation of the robot localization method, taking a sweeping robot as the example. In the embodiment shown in Fig. 6, the robot repositioning process includes three processing stages: a first processing stage 100, a second processing stage 200, and a third processing stage 300. The first processing stage 100 uses the IMU data of the sweeper to estimate its pose after being moved from one place to another; the second processing stage 200 performs local positioning using two local particle filter models; the third processing stage 300 performs global positioning using two global particle filter models.
In the embodiment shown in Fig. 6, the initial estimated position of the sweeper is first estimated from its IMU data, and two local particle filter models pf1 and pf2 are then obtained based on the initial estimated position and the map. The sweeper is positioned with pf1 to obtain a positioning mean m1 and positioning variance c1, and with pf2 to obtain a positioning mean m2 and positioning variance c2. If pf1 and pf2 both locate successfully within the preset time, i.e., c1 and c2 are each less than the preset variance threshold thres_c, and the difference m1-m2 between the two positioning means is less than the preset difference threshold thres_m, the positioning results of the two local particle filter models are fused to obtain the sweeper's position in the map.
Otherwise, global positioning is performed on the sweeper using global particle filter models: two global particle filter models pf1 and pf2 are obtained based on the entire map; the sweeper is positioned with pf1 to obtain a positioning mean m1 and positioning variance c1, and with pf2 to obtain a positioning mean m2 and positioning variance c2. If pf1 and pf2 both locate successfully within the preset time, and the difference m1-m2 between the two positioning means is less than the preset difference threshold thres_m, the positioning results of the two global particle filter models are fused to obtain the sweeper's position in the map. Otherwise, positioning fails.
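The two-stage repositioning flow of Fig. 6 can be sketched roughly as follows (the `run_filter` interface, names, and threshold values are illustrative assumptions; the patent does not prescribe this code):

```python
def relocalize(run_filter, local_models, global_models,
               thres_c=0.05, thres_m=0.2):
    """Two-stage repositioning: try the local filter pair first, then
    the global pair. run_filter(model) -> (mean, variance), each a
    3-element list [x, y, theta], obtained within the preset time.
    A stage succeeds only if both filters converge (all variances at
    most thres_c) and their means agree (differences at most thres_m).
    """
    for pf_a, pf_b in (local_models, global_models):
        (m1, c1), (m2, c2) = run_filter(pf_a), run_filter(pf_b)
        converged = all(v <= thres_c for v in list(c1) + list(c2))
        agree = all(abs(a - b) <= thres_m for a, b in zip(m1, m2))
        if converged and agree:
            # fuse the two results by inverse-variance weighting
            return [(a / va + b / vb) / (1.0 / va + 1.0 / vb)
                    for a, b, va, vb in zip(m1, m2, c1, c2)]
    return None  # both stages failed: report positioning failure
```

The local stage is tried first because its smaller search range converges faster; the global stage over the entire map is the robustness fallback.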
Correspondingly, as shown in Fig. 7, an embodiment of the present invention further provides a laser-radar-based robot positioning apparatus, which can be used for the robot 10 shown in Fig. 1 or Fig. 2. The robot positioning apparatus 700 includes:
an initial estimated position obtaining module 701, configured to obtain an initial estimated position of the robot;
a local particle filter model obtaining module 702, configured to obtain a local particle filter model according to the initial estimated position and a map;
an obstacle information obtaining module 703, configured to obtain obstacle information around the robot through the laser radar;
a positioning module 704, configured to position the robot using the local particle filter model and the obstacle information, obtaining a position of the robot in the map.
With the laser-radar-based robot localization method, apparatus, and robot of the embodiments of the present invention, the initial estimated position of the robot is obtained, a local particle filter model is obtained using the initial estimated position and a map, and the robot is positioned using the local particle filter model. Since the search range of the local particle filter model is small relative to the global positioning search range, positioning is fast.
In some embodiments, the local particle filter model includes a first local particle filter model and a second local particle filter model;
the positioning module 704 is specifically configured to:
position the robot using the first local particle filter model and the obstacle information to obtain a first positioning result;
position the robot using the second local particle filter model and the obstacle information to obtain a second positioning result;
fuse the first positioning result and the second positioning result to obtain the position of the robot in the map.
In other embodiments, the first positioning result includes a first positioning mean and a first positioning variance, and the second positioning result includes a second positioning mean and a second positioning variance;
the positioning module 704 is specifically configured to:
if the first positioning variance and the second positioning variance are less than or equal to a preset variance threshold within a preset time, and the difference between the first positioning mean and the second positioning mean is less than or equal to a preset difference threshold within the preset time,
obtain the position of the robot in the map according to the first positioning mean, the first positioning variance, the second positioning mean, and the second positioning variance.
In further embodiments, the positioning module 704 is further configured to:
if the first positioning variance or the second positioning variance does not fall to less than or equal to the preset variance threshold within the preset time, or if the first positioning variance and the second positioning variance are less than or equal to the preset variance threshold within the preset time but the difference between the first positioning mean and the second positioning mean does not fall to less than or equal to the preset difference threshold within the preset time,
obtain a third global particle filter model according to the map, and position the robot using the third global particle filter model to obtain the position of the robot in the map.
In some of these embodiments, the positioning module 704 is specifically configured to:
obtain a first positioning weight and a second positioning weight according to the first positioning variance and the second positioning variance, respectively;
perform a weighted average of the first positioning mean and the second positioning mean according to the first positioning weight and the second positioning weight, obtaining the position of the robot in the map.
In some of these embodiments, the initial estimated position obtaining module 701 is specifically configured to:
obtain a home position and an original coordinate system of the robot, the home position being the robot's position before being artificially moved;
divide the time during which the robot is artificially moved into at least two sampling time intervals;
in any sampling time interval, establish a local coordinate system according to the robot's pose, obtain the robot's displacement increment in the local coordinate system, and convert the displacement increment in the local coordinate system into a displacement increment in the original coordinate system;
obtain the initial estimated position of the robot according to the home position and the robot's displacement increments in the original coordinate system over each sampling time interval.
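The dead-reckoning accumulation described above can be sketched in 2-D as follows (the function name, planar simplification, and increment format are illustrative assumptions):

```python
import math

def initial_estimate(home_pose, increments):
    """Accumulate per-interval displacement increments onto the home pose.

    home_pose  : (x, y, theta) in the original coordinate system,
                 the pose before the robot was artificially moved
    increments : per-interval (dx, dy, dtheta), each expressed in the
                 local frame aligned with the heading at interval start
    Each local increment is rotated into the original frame, then summed.
    """
    x, y, theta = home_pose
    for dx, dy, dtheta in increments:
        c, s = math.cos(theta), math.sin(theta)
        x += c * dx - s * dy       # rotate local -> original frame
        y += s * dx + c * dy
        theta += dtheta
    return x, y, theta
```

Splitting the motion into short sampling intervals keeps each local frame approximately aligned with the true heading, so the rotation error per increment stays small.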
In further embodiments, the initial estimated position obtaining module 701 is further configured to:
obtain the angular velocity and acceleration of the robot;
integrate the angular velocity over the sampling time interval to obtain the robot's pose;
integrate the acceleration over the sampling time interval to obtain the robot's velocity;
obtain, according to the robot's velocity and acceleration, the robot's displacement increment in the local coordinate system within the sampling time interval.
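The per-interval time integration can be sketched with simple Euler steps (a 1-D heading and forward-speed simplification of the description above; the names, constant-rate inputs, and integration scheme are assumptions):

```python
def integrate_interval(omega, accel, v0, theta0, dt, steps):
    """Euler-integrate gyro rate omega and body acceleration accel
    over one sampling time interval of length dt.

    Returns (heading, speed, forward displacement) at interval end.
    """
    theta, v, d = theta0, v0, 0.0
    h = dt / steps
    for _ in range(steps):
        theta += omega * h        # integrate angular rate -> heading
        v += accel * h            # integrate acceleration -> speed
        d += v * h                # integrate speed -> displacement
    return theta, v, d
```

A full IMU implementation would integrate 3-axis rates and accelerations and subtract gravity; the structure of the double integration is the same.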
In some of these embodiments, the positioning module 704 is specifically configured to:
update the weight of each particle in the local particle filter model according to the obstacle information;
obtain a positioning mean and a positioning variance according to the weights and the positions of the particles in the local particle filter model.
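Computing the positioning mean and variance from the weighted particles can be sketched as follows (the weight update from obstacle information against the map is assumed to have already happened; the function name and array shapes are illustrative):

```python
import numpy as np

def weighted_pose_stats(particles, weights):
    """Compute the positioning mean and variance of a particle set.

    particles : (N, 3) array of [x, y, theta] pose hypotheses
    weights   : length-N array of non-negative importance weights
    """
    particles = np.asarray(particles, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                              # normalize weights
    mean = (w[:, None] * particles).sum(axis=0)  # weighted mean pose
    var = (w[:, None] * (particles - mean) ** 2).sum(axis=0)
    return mean, var
```

A small variance then indicates that the particles have concentrated around one pose, which is exactly the convergence criterion (variance below the preset threshold) used above.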
It should be noted that the above apparatus can perform the methods provided by the embodiments of the present application and has the corresponding functional modules and beneficial effects for performing those methods. For technical details not described in detail in the apparatus embodiment, reference may be made to the methods provided by the embodiments of the present application.
Fig. 8 is a hardware structural diagram of the controller 13 in one embodiment of the robot 10. As shown in Fig. 8, the controller 13 includes:
one or more processors 131 and a memory 132, with one processor 131 taken as the example in Fig. 8.
The processor 131 and the memory 132 may be connected by a bus or in other ways; connection by a bus is taken as the example in Fig. 8.
The memory 132, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the robot localization method in the embodiments of the present application (for example, the initial estimated position obtaining module 701 shown in Fig. 7). By running the non-volatile software programs, instructions, and modules stored in the memory 132, the processor 131 executes the various functional applications and data processing of the controller, thereby implementing the robot localization method of the above method embodiments.
The memory 132 may include a program storage area and a data storage area, where the program storage area can store an operating system and application programs required by at least one function, and the data storage area can store data created according to the use of the robot positioning apparatus, etc. In addition, the memory 132 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 132 optionally includes memory located remotely relative to the processor 131; these remote memories can be connected to the robot via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory 132 and, when executed by the one or more processors 131, perform the robot localization method in any of the above method embodiments, for example, performing the method steps 101 to 104 in Fig. 3 described above, and implementing the functions of the modules 701-704 in Fig. 7.
The above product can perform the methods provided by the embodiments of the present application and has the corresponding functional modules and beneficial effects for performing those methods. For technical details not described in detail in this embodiment, reference may be made to the methods provided by the embodiments of the present application.
An embodiment of the present application provides a non-volatile computer-readable storage medium storing computer-executable instructions. The computer-executable instructions, when executed by one or more processors, such as one processor 131 in Fig. 8, may cause the one or more processors to perform the robot localization method in any of the above method embodiments, for example, performing the method steps 101 to 104 in Fig. 3 described above, and implementing the functions of the modules 701-704 in Fig. 7.
The apparatus embodiments described above are merely exemplary. The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units: they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
From the description of the above embodiments, those of ordinary skill in the art can clearly understand that each embodiment can be implemented by software plus a general hardware platform, and of course also by hardware. Those of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be completed by a computer program instructing related hardware; the program can be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Under the concept of the present invention, technical features in the above embodiments or in different embodiments may be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the present invention as described above, which are not provided in detail for the sake of brevity. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or equivalently replace some of the technical features; such modifications or replacements do not remove the essence of the corresponding technical solutions from the scope of the technical solutions of the embodiments of the present invention.

Claims (12)

1. A laser-radar-based robot localization method for a robot, the robot including a laser radar, characterized in that the method comprises:
obtaining an initial estimated position of the robot;
obtaining a local particle filter model according to the initial estimated position and a map;
obtaining obstacle information around the robot through the laser radar;
positioning the robot using the local particle filter model and the obstacle information, obtaining a position of the robot in the map.
2. The robot localization method according to claim 1, characterized in that the local particle filter model includes a first local particle filter model and a second local particle filter model;
said positioning the robot using the local particle filter model and the obstacle information to obtain the position of the robot in the map comprises:
positioning the robot using the first local particle filter model and the obstacle information to obtain a first positioning result;
positioning the robot using the second local particle filter model and the obstacle information to obtain a second positioning result;
fusing the first positioning result and the second positioning result to obtain the position of the robot in the map.
3. The robot localization method according to claim 2, characterized in that the first positioning result includes a first positioning mean and a first positioning variance, and the second positioning result includes a second positioning mean and a second positioning variance;
said fusing the first positioning result and the second positioning result to obtain the position of the robot in the map comprises:
if the first positioning variance and the second positioning variance are less than or equal to a preset variance threshold within a preset time, and the difference between the first positioning mean and the second positioning mean is less than or equal to a preset difference threshold within the preset time,
obtaining the position of the robot in the map according to the first positioning mean, the first positioning variance, the second positioning mean, and the second positioning variance.
4. The robot localization method according to claim 3, characterized in that said fusing the first positioning result and the second positioning result to obtain the position of the robot in the map further comprises:
if the first positioning variance or the second positioning variance does not fall to less than or equal to the preset variance threshold within the preset time, or if the first positioning variance and the second positioning variance are less than or equal to the preset variance threshold within the preset time but the difference between the first positioning mean and the second positioning mean does not fall to less than or equal to the preset difference threshold within the preset time,
obtaining a third global particle filter model according to the map, and positioning the robot using the third global particle filter model to obtain the position of the robot in the map.
5. The robot localization method according to claim 3 or 4, characterized in that said obtaining the position of the robot in the map according to the first positioning mean, the first positioning variance, the second positioning mean, and the second positioning variance comprises:
obtaining a first positioning weight and a second positioning weight according to the first positioning variance and the second positioning variance, respectively;
performing a weighted average of the first positioning mean and the second positioning mean according to the first positioning weight and the second positioning weight, obtaining the position of the robot in the map.
6. The robot localization method according to any one of claims 1-4, characterized in that said obtaining the initial estimated position of the robot comprises:
obtaining a home position and an original coordinate system of the robot, the home position being the robot's position before being artificially moved;
dividing the time during which the robot is artificially moved into at least two sampling time intervals;
in any sampling time interval, establishing a local coordinate system according to the robot's pose, obtaining the robot's displacement increment in the local coordinate system, and converting the displacement increment in the local coordinate system into a displacement increment in the original coordinate system;
obtaining the initial estimated position of the robot according to the home position and the robot's displacement increments in the original coordinate system over each sampling time interval.
7. The robot localization method according to claim 6, characterized in that said obtaining the initial estimated position of the robot further comprises:
obtaining the angular velocity and acceleration of the robot;
integrating the angular velocity over the sampling time interval to obtain the robot's pose;
integrating the acceleration over the sampling time interval to obtain the robot's velocity;
obtaining, according to the robot's velocity and acceleration, the robot's displacement increment in the local coordinate system within the sampling time interval.
8. The robot localization method according to any one of claims 1-4, characterized in that said positioning the robot using the local particle filter model and the obstacle information comprises:
updating the weight of each particle in the local particle filter model according to the obstacle information;
obtaining a positioning mean and a positioning variance according to the weights and the positions of the particles in the local particle filter model.
9. A laser-radar-based robot positioning apparatus for a robot, the robot including a laser radar, characterized in that the apparatus comprises:
an initial estimated position obtaining module, configured to obtain an initial estimated position of the robot;
a local particle filter model obtaining module, configured to obtain a local particle filter model according to the initial estimated position and a map;
an obstacle information obtaining module, configured to obtain obstacle information around the robot through the laser radar;
a positioning module, configured to position the robot using the local particle filter model and the obstacle information, obtaining a position of the robot in the map.
10. The robot positioning apparatus according to claim 9, characterized in that the local particle filter model includes a first local particle filter model and a second local particle filter model;
the positioning module is configured to:
position the robot using the first local particle filter model and the obstacle information to obtain a first positioning result;
position the robot using the second local particle filter model and the obstacle information to obtain a second positioning result;
fuse the first positioning result and the second positioning result to obtain the position of the robot in the map.
11. A robot, characterized in that the robot comprises:
a robot body;
a walking mechanism on the robot body;
a laser radar and an inertial measurement unit on the robot body, the laser radar being used to obtain obstacle information of the robot's surroundings;
a controller built into the robot body and connected to the laser radar and the inertial measurement unit;
the controller including:
at least one processor, and
a memory communicatively connected to the at least one processor, the memory storing instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
12. A non-volatile computer-readable storage medium, characterized in that the computer-readable storage medium stores computer-executable instructions which, when executed by a robot, cause the robot to perform the method of any one of claims 1-8.
CN201910731221.9A 2019-08-08 2019-08-08 Laser radar-based robot positioning method and device and robot Active CN110488818B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910731221.9A CN110488818B (en) 2019-08-08 2019-08-08 Laser radar-based robot positioning method and device and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910731221.9A CN110488818B (en) 2019-08-08 2019-08-08 Laser radar-based robot positioning method and device and robot

Publications (2)

Publication Number Publication Date
CN110488818A true CN110488818A (en) 2019-11-22
CN110488818B CN110488818B (en) 2020-07-17

Family

ID=68550337

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910731221.9A Active CN110488818B (en) 2019-08-08 2019-08-08 Laser radar-based robot positioning method and device and robot

Country Status (1)

Country Link
CN (1) CN110488818B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111174782A (en) * 2019-12-31 2020-05-19 智车优行科技(上海)有限公司 Pose estimation method and device, electronic equipment and computer readable storage medium
CN111267103A (en) * 2020-03-09 2020-06-12 深圳拓邦股份有限公司 Method and device for acquiring initial position of robot, robot and storage medium
CN111352066A (en) * 2020-03-27 2020-06-30 西安震有信通科技有限公司 Particle filter-based positioning method and device, computer equipment and storage medium
CN111474535A (en) * 2020-03-18 2020-07-31 广东省智能机器人研究院 Mobile robot global positioning method based on characteristic thermodynamic diagram
CN111947649A (en) * 2020-06-21 2020-11-17 珠海市一微半导体有限公司 Robot positioning method based on data fusion, chip and robot
CN113558524A (en) * 2021-07-14 2021-10-29 北京小狗吸尘器集团股份有限公司 Sweeping robot and method and device for repositioning lifted sweeping robot
CN116242410A (en) * 2022-09-05 2023-06-09 浙江智马达智能科技有限公司 Calibration method, terminal and computer storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130116819A1 (en) * 2011-11-09 2013-05-09 Fujitsu Limited Estimating apparatus, estimating method, and computer product
CN106066179A (en) * 2016-07-27 2016-11-02 湖南晖龙股份有限公司 A kind of robot location based on ROS operating system loses method for retrieving and control system
CN107991683A (en) * 2017-11-08 2018-05-04 华中科技大学 A kind of robot autonomous localization method based on laser radar
CN108507579A (en) * 2018-04-08 2018-09-07 浙江大承机器人科技有限公司 A kind of method for relocating based on localized particle filtering
CN108632761A (en) * 2018-04-20 2018-10-09 西安交通大学 A kind of indoor orientation method based on particle filter algorithm
CN108931245A (en) * 2018-08-02 2018-12-04 上海思岚科技有限公司 The local method for self-locating and equipment of mobile robot
CN109725329A (en) * 2019-02-20 2019-05-07 苏州风图智能科技有限公司 A kind of unmanned vehicle localization method and device
CN109870716A (en) * 2017-12-01 2019-06-11 北京京东尚科信息技术有限公司 Localization method and positioning device and computer readable storage medium

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130116819A1 (en) * 2011-11-09 2013-05-09 Fujitsu Limited Estimating apparatus, estimating method, and computer product
CN106066179A (en) * 2016-07-27 2016-11-02 湖南晖龙股份有限公司 A kind of robot location based on ROS operating system loses method for retrieving and control system
CN107991683A (en) * 2017-11-08 2018-05-04 华中科技大学 A kind of robot autonomous localization method based on laser radar
CN109870716A (en) * 2017-12-01 2019-06-11 北京京东尚科信息技术有限公司 Localization method and positioning device and computer readable storage medium
CN108507579A (en) * 2018-04-08 2018-09-07 浙江大承机器人科技有限公司 A kind of method for relocating based on localized particle filtering
CN108632761A (en) * 2018-04-20 2018-10-09 西安交通大学 A kind of indoor orientation method based on particle filter algorithm
CN108931245A (en) * 2018-08-02 2018-12-04 上海思岚科技有限公司 The local method for self-locating and equipment of mobile robot
CN109725329A (en) * 2019-02-20 2019-05-07 苏州风图智能科技有限公司 A kind of unmanned vehicle localization method and device

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111174782A (en) * 2019-12-31 2020-05-19 智车优行科技(上海)有限公司 Pose estimation method and device, electronic equipment and computer readable storage medium
CN111174782B (en) * 2019-12-31 2021-09-17 智车优行科技(上海)有限公司 Pose estimation method and device, electronic equipment and computer readable storage medium
CN111267103A (en) * 2020-03-09 2020-06-12 深圳拓邦股份有限公司 Method and device for acquiring initial position of robot, robot and storage medium
CN111474535A (en) * 2020-03-18 2020-07-31 广东省智能机器人研究院 Mobile robot global positioning method based on characteristic thermodynamic diagram
CN111352066A (en) * 2020-03-27 2020-06-30 西安震有信通科技有限公司 Particle filter-based positioning method and device, computer equipment and storage medium
CN111352066B (en) * 2020-03-27 2022-02-22 西安震有信通科技有限公司 Particle filter-based positioning method and device, computer equipment and storage medium
CN111947649A (en) * 2020-06-21 2020-11-17 珠海市一微半导体有限公司 Robot positioning method based on data fusion, chip and robot
CN113558524A (en) * 2021-07-14 2021-10-29 北京小狗吸尘器集团股份有限公司 Sweeping robot and method and device for repositioning lifted sweeping robot
CN116242410A (en) * 2022-09-05 2023-06-09 浙江智马达智能科技有限公司 Calibration method, terminal and computer storage medium
CN116242410B (en) * 2022-09-05 2023-12-19 浙江智马达智能科技有限公司 Calibration method, terminal and computer storage medium

Also Published As

Publication number Publication date
CN110488818B (en) 2020-07-17

Similar Documents

Publication Publication Date Title
CN110488818A (en) A kind of robot localization method, apparatus and robot based on laser radar
CN109084732B (en) Positioning and navigation method, device and processing equipment
Mansouri et al. Cooperative coverage path planning for visual inspection
CN110850873B (en) Unmanned ship path planning method, device, equipment and storage medium
KR102292277B1 (en) LIDAR localization inferring solutions using 3D CNN networks in autonomous vehicles
CN110312912B (en) Automatic vehicle parking system and method
EP3583380B1 (en) Shape estimating apparatus
Scherer et al. River mapping from a flying robot: state estimation, river detection, and obstacle mapping
KR102350181B1 (en) LIDAR Position Estimation Using RNN and LSTM to Perform Temporal Smoothing in Autonomous Vehicles
KR101725060B1 (en) Apparatus for recognizing location mobile robot using key point based on gradient and method thereof
CN104236548B (en) Autonomous navigation method in a kind of MAV room
JP5987823B2 (en) Method and system for fusing data originating from image sensors and motion or position sensors
CN110118560B (en) Indoor positioning method based on LSTM and multi-sensor fusion
CN114474061B (en) Cloud service-based multi-sensor fusion positioning navigation system and method for robot
JP6855524B2 (en) Unsupervised learning of metric representations from slow features
CN108646761A (en) Robot indoor environment exploration, avoidance and method for tracking target based on ROS
US20130116823A1 (en) Mobile apparatus and walking robot
CN111145251A (en) Robot, synchronous positioning and mapping method thereof and computer storage device
CN114387462A (en) Dynamic environment sensing method based on binocular camera
CN113433937A (en) Heuristic exploration-based layered navigation obstacle avoidance system and layered navigation obstacle avoidance method
Yang et al. Vision-based localization and mapping for an autonomous mower
WO2022091305A1 (en) Behavior estimation device, behavior estimation method, route generation device, route generation method, and computer-readable recording medium
De Silva et al. Comparative analysis of octomap and rtabmap for multi-robot disaster site mapping
CN116679710A (en) Robot obstacle avoidance strategy training and deployment method based on multitask learning
CN114429432B (en) Multi-source information layered fusion method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000 1701, building 2, Yinxing Zhijie, No. 1301-72, sightseeing Road, Xinlan community, Guanlan street, Longhua District, Shenzhen, Guangdong Province

Patentee after: Shenzhen Yinxing Intelligent Group Co.,Ltd.

Address before: 518000 building A1, Yinxing hi tech Industrial Park, Guanlan street, Longhua District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Silver Star Intelligent Technology Co.,Ltd.

CP03 Change of name, title or address