CN113432533A - Robot positioning method and device, robot and storage medium - Google Patents

Robot positioning method and device, robot and storage medium

Info

Publication number
CN113432533A
CN113432533A
Authority
CN
China
Prior art keywords
current
robot
laser
matching
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110750179.2A
Other languages
Chinese (zh)
Other versions
CN113432533B (en)
Inventor
刘嗣超 (Liu Sichao)
闫东坤 (Yan Dongkun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yingdi Mande Technology Co ltd
Original Assignee
Beijing Yingdi Mande Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yingdi Mande Technology Co ltd filed Critical Beijing Yingdi Mande Technology Co ltd
Publication of CN113432533A publication Critical patent/CN113432533A/en
Application granted granted Critical
Publication of CN113432533B publication Critical patent/CN113432533B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides a robot positioning method and apparatus, a robot, and a storage medium, wherein the method includes: acquiring the current laser point cloud and the local probability map corresponding to the robot's current operating environment; judging, from how well the current laser point cloud matches the local probability map, whether laser degradation currently exists; when it does, determining the laser degradation direction from the position information of the matching points between the current laser point cloud and the local probability map; acquiring the robot's current predicted position; and determining the robot's current pose information from the current predicted position, the laser degradation direction, and the position and angle information of each matching point. By fusing laser detection information with predicted position information according to the laser degradation condition, the method determines the robot's pose information and improves the robot's positioning accuracy.

Description

Robot positioning method and device, robot and storage medium
Technical Field
The present disclosure relates to the field of artificial intelligence technologies, and in particular, to a robot positioning method and apparatus, a robot, and a storage medium.
Background
Laser SLAM (simultaneous localization and mapping) is currently one of the most widely used robot positioning techniques. In laser SLAM, certain challenging scenes degrade positioning accuracy, such as corridors, single-sided walls, and circular boundaries.
Existing approaches each handle only one particular scene, such as a long corridor. One known method uses a point-cloud feature-extraction front end to determine that the robot is currently in a long corridor, then fuses the predicted robot position with the heading obtained from lidar scan matching to produce the fused robot pose.
However, existing handling of laser degradation is limited to certain predetermined scenes and takes the characteristics of those scenes as its processing basis, so it adapts poorly and the robot's positioning accuracy is low.
Disclosure of Invention
The application provides a robot positioning method, a robot positioning device, a robot and a storage medium, which aim to overcome the defects of low positioning precision and the like in the prior art.
A first aspect of the present application provides a robot positioning method, including:
acquiring a current laser point cloud and a local probability map corresponding to the current operating environment of the robot;
judging whether laser degradation exists currently according to the matching condition between the current laser point cloud and the local probability map;
when laser degradation exists currently, determining a laser degradation direction according to the position information of a matching point between the current laser point cloud and the local probability map;
acquiring a current predicted position of the robot;
and determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point.
Optionally, the determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point, and the angle information of each matching point includes:
determining the laser degradation direction as a first direction, and determining a direction perpendicular to the laser degradation direction as a second direction;
determining a first direction coordinate corresponding to the current predicted position as a current first direction coordinate of the robot;
determining the current second direction coordinate of the robot according to the position information of each matching point;
determining the current orientation of the robot according to the angle information of each matching point;
and determining the current pose information according to the current first direction coordinate, the current second direction coordinate and the current orientation of the robot.
Optionally, the determining whether laser degradation exists currently according to the matching condition between the current laser point cloud and the local probability map includes:
calculating a matching value between the current laser point cloud and the local probability map;
when the matching value is larger than a preset matching value threshold value, determining an acquisition point corresponding to the current laser point cloud as a matching point;
and when the number of the matching points exceeds a first preset threshold value, determining that laser degradation exists currently.
Optionally, the method further includes:
when the number of the matching points is larger than a second preset threshold and smaller than a first preset threshold, counting the number of the angle matching points corresponding to the current acquisition point;
and when the number of the angle matching points exceeds a third preset threshold value, determining that laser angle degradation exists currently.
Optionally, the method further includes:
acquiring a current predicted orientation of the robot;
determining the current position information of the robot according to the position information of the matching points;
and determining the current pose information of the robot according to the current predicted orientation and the current position information of the robot.
Optionally, the calculating a matching value between the current laser point cloud and the local probability map includes:
the match value Score is determined according to the following formula:
Score = (P(x_1, y_1) + ... + P(x_i, y_i) + ... + P(x_n, y_n)) / n
where (x_i, y_i) denotes the grid coordinates of the i-th point in the point cloud, P(x_i, y_i) denotes the occupancy probability of the grid cell containing the i-th point, and n is the number of points in the scan.
Optionally, the obtaining of the current laser point cloud and the local probability map corresponding to the current operating environment of the robot includes:
acquiring historical point cloud and radar signals of the robot in the current operating environment;
constructing a current laser point cloud according to the radar signal;
and constructing the local probability map according to the historical point cloud.
A second aspect of the present application provides a robot positioning device, comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a current laser point cloud and a local probability map corresponding to the current operating environment of the robot;
the judging module is used for judging whether laser degradation exists currently according to the matching condition between the current laser point cloud and the local probability map;
the determining module is used for determining the laser degradation direction according to the position information of the matching point between the current laser point cloud and the local probability map when the laser degradation exists currently;
the second acquisition module is used for acquiring the current predicted position of the robot;
and the positioning module is used for determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point.
A third aspect of the present application provides a robot comprising: the system comprises a laser radar, a predicted pose sensor, at least one processor and a memory;
the laser radar carries out laser detection;
the prediction pose sensor is used for acquiring a current prediction position and a current prediction orientation of the robot;
the memory stores computer-executable instructions;
the at least one processor executes computer-executable instructions stored by the memory to cause the at least one processor to perform the method as set forth in the first aspect above and in various possible designs of the first aspect.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon computer-executable instructions that, when executed by a processor, implement a method as set forth in the first aspect and various possible designs of the first aspect.
This application technical scheme has following advantage:
according to the robot positioning method, the robot positioning device, the robot and the storage medium, the current laser point cloud and the local probability map corresponding to the current operation environment of the robot are obtained; judging whether laser degradation exists currently according to the matching condition between the current laser point cloud and the local probability map; when laser degradation exists currently, determining a laser degradation direction according to the position information of a matching point between the current laser point cloud and the local probability map; acquiring a current predicted position of the robot; and determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point. According to the method provided by the scheme, the laser detection information and the predicted position information are fused according to the laser degradation condition, so that the pose information of the robot is determined, and the positioning accuracy of the robot is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of a robot positioning system based on an embodiment of the present application;
fig. 2 is a schematic flowchart of a robot positioning method according to an embodiment of the present disclosure;
fig. 3 is a diagram of an exemplary robot operation scenario provided in an embodiment of the present application;
fig. 4 is a diagram of another exemplary robot operation scenario provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a robot positioning device according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a robot according to an embodiment of the present application.
With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the concepts of the disclosure to those skilled in the art by reference to specific embodiments.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Furthermore, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. In the description of the following examples, "plurality" means two or more unless specifically limited otherwise.
The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
First, a configuration of a robot positioning system based on the present application will be described:
the robot positioning method and device, the robot and the storage medium are suitable for detecting the current pose information of the robot. As shown in fig. 1, the structural schematic diagram of the robot positioning system according to the embodiment of the present disclosure mainly includes a data acquisition device and a robot positioning device for detecting current pose information of a robot, where both the data acquisition device and the robot positioning device may be disposed inside the robot, and both may be disposed on other electronic devices if conditions allow. Specifically, a data acquisition device acquires the current laser point cloud and the local probability map of the robot, and transmits the acquired data to a robot positioning device so that the robot positioning device can determine the current pose information of the current robot.
The embodiment of the application provides a robot positioning method for determining robot pose information.
As shown in fig. 2, a schematic flowchart of a robot positioning method provided in an embodiment of the present application is shown, where the method includes:
step 201, obtaining a current laser point cloud and a local probability map corresponding to a current operation environment of the robot.
It should be noted that a lidar is mounted on the robot to sense the surrounding environment, and the current laser point cloud may be generated from the lidar's radar signal. The local probability map is built up step by step as laser point clouds are continuously generated; it reflects the robot's surroundings over the whole course of its motion, i.e., it is a representation of the robot's historical laser point clouds before the current moment.
Specifically, in one embodiment, a historical point cloud and a radar signal in the current operating environment of the robot can be obtained; constructing a current laser point cloud according to the radar signal; and constructing a local probability map according to the historical point cloud.
For example, when a laser beam strikes the surface of an object, the reflected beam (the radar signal) carries information such as azimuth and distance. As the beam is swept along a trajectory, the reflected laser-point information is recorded during scanning; because lidar scanning is extremely fine, a large number of laser points is obtained, forming a laser point cloud. As laser point clouds are continuously formed, the corresponding local probability map can be constructed.
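By way of illustration only, the following Python sketch shows one way such a local probability map could be accumulated from successive scans; the grid resolution, origin, and hit-frequency update rule are assumptions for the example, since the patent does not prescribe a particular map-update scheme.

    import numpy as np

    # Minimal sketch: accumulate laser endpoints into hit/visit grids and
    # derive an occupancy-probability map P = hits / visits per cell.
    def update_local_probability_map(hits, visits, points, resolution=0.05, origin=(0.0, 0.0)):
        for x, y in points:  # points: scan endpoints in world coordinates
            i = int((x - origin[0]) / resolution)
            j = int((y - origin[1]) / resolution)
            if 0 <= i < hits.shape[0] and 0 <= j < hits.shape[1]:
                hits[i, j] += 1
                visits[i, j] += 1  # a full system would also ray-trace free cells
        return np.divide(hits, visits, out=np.zeros(hits.shape), where=visits > 0)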
Step 202, judging whether laser degradation exists currently according to the matching condition between the current laser point cloud and the local probability map.
It should be noted that when the robot travels through a scene such as a corridor, the laser point clouds obtained at successive moments are almost identical, so the robot can hardly determine its current position and orientation from the radar signal; laser degradation can be determined to exist in this case.
It should further be noted that the prior art generally predicts laser degradation from detected scene information (e.g., that the robot is currently crossing a corridor), and the reliability of that scene detection is hard to guarantee, so the accuracy of prior-art laser degradation judgments is low. The embodiment of the present application instead judges whether laser degradation occurs from the concrete feedback of the radar signal and its matching against the local probability map, yielding a more accurate judgment.
And step 203, when laser degradation exists currently, determining a laser degradation direction according to the position information of the matching points between the current laser point cloud and the local probability map.
The matching point is a certain pose point of the robot, and the pose point information mainly comprises position information and orientation information of the robot.
Specifically, the pose points at which the robot experiences laser degradation can be determined from the position information of the matched pose points, and the laser degradation direction is determined from the position information of each matching point. The laser degradation direction includes a positional degradation direction and/or an angular degradation direction. For example, while passing through a corridor, the laser degradation direction is a positional degradation direction, namely the direction along the corridor.
Specifically, the laser degradation direction may be determined according to the position information of each matching point by using a least square straight line fitting method.
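A minimal sketch of that fitting step is given below; it uses a total-least-squares fit (the principal eigenvector of the position scatter matrix) rather than ordinary y-on-x regression, an assumed implementation choice that also handles corridors parallel to the y axis.

    import numpy as np

    def fit_degradation_direction(matching_positions):
        # Fit a straight line through the matching-point positions and return
        # its unit direction vector, taken here as the laser degradation direction.
        pts = np.asarray(matching_positions, dtype=float)  # shape (m, 2)
        centered = pts - pts.mean(axis=0)
        cov = centered.T @ centered                        # 2x2 scatter matrix
        _, eigvecs = np.linalg.eigh(cov)                   # eigenvalues ascending
        direction = eigvecs[:, -1]                         # axis of largest spread
        return direction / np.linalg.norm(direction)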
And step 204, acquiring the current predicted position of the robot.
It should be noted that other predicted-pose sensors may also be deployed on the robot, for example odometers, inertial measurement units (IMUs), and the like.
Specifically, as the robot moves, the predicted-pose sensor acquires the robot's state information in real time, from which the robot's position is predicted.
And step 205, determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point.
The pose information of the robot includes position coordinates and orientation (angle information) of the robot.
Specifically, once the laser degradation direction is determined, the position coordinate along that direction is known to be unreliable, so the robot's coordinate in the laser degradation direction may be taken from its current predicted position. The coordinate in the non-degraded direction can still be determined from the position information of the matching points, and the robot's orientation can likewise be determined from their angle information. By combining the lidar's position information in the non-degraded direction with the predicted position information acquired by another predicted-pose sensor, the embodiment of the present application eliminates the positioning interference caused by laser degradation and improves the robot's positioning accuracy.
Specifically, in one embodiment, the laser degradation direction may be determined as a first direction, and a direction perpendicular to the laser degradation direction may be determined as a second direction; determining a first direction coordinate corresponding to the current predicted position as a current first direction coordinate of the robot; determining the current second direction coordinate of the robot according to the position information of each matching point; determining the current orientation of the robot according to the angle information of each matching point; and determining current pose information according to the current first direction coordinate, the current second direction coordinate and the current orientation of the robot.
For example, if the position coordinates of the robot are (x, y), the first direction may be an x-axis direction, and the second direction may be a y-axis direction.
Specifically, when the robot passes through a corridor and the laser degradation direction is the direction of travel through the corridor (the corridor direction), the first direction (e.g., the x-axis direction) is the corridor direction and the second direction (e.g., the y-axis direction) is the width direction of the corridor. The x-axis coordinate of the robot may then be taken from the current predicted position (x1, y1), i.e., x1, while the y-axis coordinate may be taken from the position information (x2, y2) reflected by the matching points, i.e., y2, giving position coordinates (x1, y2). Finally, the robot's current orientation is determined from the angle information reflected by the matching points, and the current pose information is obtained by combining the position coordinates with the current orientation.
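The coordinate composition just described might look as follows in Python; projecting onto the degradation direction generalizes the x/y example above, and all names here are illustrative rather than from the patent.

    import numpy as np

    def fuse_pose(predicted_xy, matched_xy, matched_heading, degradation_dir):
        # First-direction coordinate from the predicted position, second-direction
        # coordinate and heading from laser matching, then back to the world frame.
        d = np.asarray(degradation_dir, dtype=float)
        d = d / np.linalg.norm(d)                  # unit first (degraded) direction
        p = np.array([-d[1], d[0]])                # unit second direction
        first = float(np.dot(predicted_xy, d))     # trust prediction along the degraded axis
        second = float(np.dot(matched_xy, p))      # trust the laser across it
        xy = first * d + second * p
        return float(xy[0]), float(xy[1]), matched_heading

With degradation_dir = (1, 0), fuse_pose((x1, y1), (x2, y2), theta, (1, 0)) reproduces the (x1, y2) position of the example above.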
For example, fig. 3 shows an exemplary robot operation scene provided by an embodiment of the present application. In a long-corridor scene, matching the laser against the map yields the multiple solutions shown by the dotted lines; that is, the laser degenerates along the corridor direction. The embodiment of the present application calculates the fitted direction of these solutions, i.e., the corridor direction, uses the position predicted by the other sensors (the predicted-pose sensors) along that direction, and uses the laser matching result for the angle and for the position perpendicular to the corridor direction, thereby making maximal use of the trustworthy laser information and improving the robot's positioning accuracy. The same applies to the single-sided-wall case.
On the basis of the foregoing embodiment, in order to further improve the reliability of the laser degradation determination result, as an implementable manner, in an embodiment, determining whether there is laser degradation currently according to a matching condition between the current laser point cloud and the local probability map includes:
step 2021, calculating a matching value between the current laser point cloud and the local probability map;
step 2022, when the matching value is greater than a preset matching value threshold, determining the acquisition point corresponding to the current laser point cloud as a matching point;
step 2023, when the number of the matching points exceeds the first preset threshold, determining that laser degradation currently exists.
Specifically, each time the robot arrives at a position, a current laser point cloud is generated and the matching value (similarity) between it and the local probability map is calculated. When the matching value is greater than the matching-value threshold, the current laser point cloud is determined to match the local probability map, and the robot's current pose point is determined to be a matching point. As matching-point detection repeats, the number of matching points may keep increasing; once it exceeds the first preset threshold, it can be determined that laser degradation currently exists.
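As a sketch under the same assumptions, steps 2021 to 2023 could be written as below; the candidate poses, their scores, and the threshold names are placeholders for illustration.

    def detect_laser_degradation(candidate_poses, scores, match_threshold, first_threshold):
        # Keep candidates whose point-cloud-to-map matching value exceeds the
        # preset matching-value threshold; an excess of well-matching candidates
        # means the match is ambiguous, i.e., laser degradation currently exists.
        matching_points = [pose for pose, s in zip(candidate_poses, scores) if s > match_threshold]
        return len(matching_points) > first_threshold, matching_points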
Specifically, in an embodiment, in order to guarantee the reliability of the matching value calculation result, the matching value Score may be determined according to the following formula:
Score = (P(x_1, y_1) + ... + P(x_i, y_i) + ... + P(x_n, y_n)) / n
wherein (x_i, y_i) denotes the grid coordinates of the i-th point in the point cloud, P(x_i, y_i) denotes the occupancy probability of the grid cell containing the i-th point, and n is the number of points in the scan.
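A direct transcription of this formula into Python might read as follows; the grid-indexing convention and the handling of points falling outside the map (scored as zero here) are assumptions the patent leaves open.

    def matching_score(prob_map, points, resolution=0.05, origin=(0.0, 0.0)):
        # Score = (P(x_1, y_1) + ... + P(x_n, y_n)) / n over the n scan points.
        total = 0.0
        for x, y in points:
            i = int((x - origin[0]) / resolution)
            j = int((y - origin[1]) / resolution)
            if 0 <= i < prob_map.shape[0] and 0 <= j < prob_map.shape[1]:
                total += prob_map[i, j]           # grid occupancy probability
        return total / len(points)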
On the basis of the above embodiments, laser degradation is not reflected only in position-coordinate detection; in practice, laser angle degradation may also occur, in which case the robot's current orientation cannot be determined from the radar signal, which likewise affects positioning accuracy.
Therefore, for the above problem, when the number of the matching points is greater than the second preset threshold and smaller than the first preset threshold, the number of the angle matching points corresponding to the current acquisition point may be counted; and when the number of the angle matching points exceeds a third preset threshold value, determining that laser angle degradation exists currently. Otherwise, it is determined that there is no laser degradation present.
Specifically, when the number of matching points is greater than the second preset threshold and smaller than the first preset threshold, it can be determined that positioning conditions in the operating scene are good and there is no laser position degradation. To further judge whether laser angle degradation exists, the number of angle matching points corresponding to the current acquisition point is counted: with the robot held stationary at the current acquisition point, count the orientations in which the laser point cloud still matches the map, each matched orientation being recorded as an angle matching point. When the number of angle matching points exceeds the third preset threshold, it is determined that laser angle degradation currently exists and the robot orientation fed back by the lidar is not trustworthy.
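The orientation sweep described here might be sketched as follows, reusing the matching_score sketch above; the set of candidate angles and the threshold are illustrative assumptions.

    import numpy as np

    def count_angle_matching_points(prob_map, scan_points, candidate_angles, match_threshold):
        # Hold the robot's position fixed, rotate the scan (robot-frame points)
        # through candidate orientations, and count how many still match the map.
        # (Translation to the map frame by the fixed pose position is omitted.)
        pts = np.asarray(scan_points, dtype=float)
        count = 0
        for theta in candidate_angles:
            c, s = np.cos(theta), np.sin(theta)
            rotated = pts @ np.array([[c, -s], [s, c]]).T  # rotate by theta
            if matching_score(prob_map, rotated) > match_threshold:
                count += 1
        return count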
The first preset threshold provided in this embodiment is greater than the second preset threshold; the relationship of the third preset threshold to the other two, and the specific value of each threshold, may be set according to the actual situation, which this embodiment does not limit.
Further, when the laser degradation direction is an angular degradation direction, that is, when it is determined that there is currently laser angle degradation, the following method may be further included: acquiring a current predicted orientation of the robot; determining the current position information of the robot according to the position information of the matching points; and determining the current pose information of the robot according to the current predicted orientation and the current position information of the robot.
Further, to compensate for the orientation-detection failure caused by laser angle degradation, the robot's current orientation may be determined from the current predicted orientation given by a predicted-pose sensor on the robot (e.g., an IMU or another predicted-pose sensor). The robot's position coordinates (current position information) can still be determined from the position information of the matching points.
For example, as shown in fig. 4, another exemplary robot operation scene provided by an embodiment of the present application, the robot is inside a circle (e.g., a circular-boundary scene): the position obtained by laser matching is stable, but the angle (orientation) admits arbitrarily many solutions, so positioning accuracy is ensured by using the angle predicted by the other sensors together with the laser-matched position.
Similarly, when the laser degradation direction includes both a positional degradation direction and an angular degradation direction, i.e., when both laser position degradation and laser angle degradation are determined to currently exist, the predicted position and predicted angle from the other sensors can be used exclusively until the laser data becomes trustworthy again, after which the laser data is once more selected for robot positioning.
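Putting the cases together, the overall selection logic might look like the sketch below; fuse_pose refers to the earlier illustrative helper, and the degradation flags are assumed to come from the checks described above.

    def select_pose(pos_degraded, angle_degraded, laser_pose, predicted_pose, degradation_dir):
        # Trust the laser where it is healthy; fall back to the predicted-pose
        # sensors where it is not.
        lx, ly, ltheta = laser_pose
        px, py, ptheta = predicted_pose
        if pos_degraded and angle_degraded:
            return predicted_pose                  # laser fully untrustworthy
        if angle_degraded:
            return lx, ly, ptheta                  # laser position + predicted heading
        if pos_degraded:
            return fuse_pose((px, py), (lx, ly), ltheta, degradation_dir)
        return laser_pose                          # no degradation: pure laser result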
In one embodiment, when the robot runs in an open scene, its laser cannot reach any obstacle and the laser-to-map matching score falls below the threshold; the predictions of the other sensors can then be used exclusively until the laser data becomes trustworthy again, after which the laser data is once more selected for robot positioning.
The embodiment of the present application provides a robot positioning method that acquires the current laser point cloud and local probability map corresponding to the robot's current operating environment; judges, from how well the current laser point cloud matches the local probability map, whether laser degradation currently exists; when it does, determines the laser degradation direction from the position information of the matching points between the current laser point cloud and the local probability map; acquires the robot's current predicted position; and determines the robot's current pose information from the current predicted position, the laser degradation direction, and the position and angle information of each matching point. By fusing laser detection information with predicted position information according to the laser degradation condition, the method determines the robot's pose information and improves positioning accuracy. Moreover, the reliability of the laser degradation judgment is improved, laying a foundation for further improving the robot's positioning accuracy.
The embodiment of the application provides a robot positioning device, which is used for executing the robot positioning method provided by the embodiment.
Fig. 5 is a schematic structural diagram of a robot positioning device according to an embodiment of the present disclosure. The apparatus 50 includes a first obtaining module 501, a judging module 502, a determining module 503, a second obtaining module 504, and a positioning module 505.
The first obtaining module 501 is configured to obtain a current laser point cloud and a local probability map corresponding to a current operating environment of the robot; a judging module 502, configured to judge whether laser degradation exists currently according to a matching condition between a current laser point cloud and a local probability map; a determining module 503, configured to determine a laser degradation direction according to position information of a matching point between a current laser point cloud and a local probability map when laser degradation exists currently; a second obtaining module 504, configured to obtain a current predicted position of the robot; and the positioning module 505 is configured to determine current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point, and the angle information of each matching point.
With regard to the robot positioning device in the present embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The robot positioning device provided by the embodiment of the application is used for executing the robot positioning method provided by the embodiment, and the implementation manner and the principle of the robot positioning device are the same and are not repeated.
The embodiment of the application provides a robot, which is used for executing the robot positioning method provided by the embodiment.
Fig. 6 is a schematic structural diagram of a robot according to an embodiment of the present disclosure. The robot 60 includes: at least one processor 61, memory 62, lidar 63, and a predicted pose sensor 64;
the laser radar is used for laser detection; the prediction pose sensor is used for acquiring the current prediction position and the current prediction orientation of the robot; the memory stores computer-executable instructions; the at least one processor executes computer-executable instructions stored by the memory to cause the at least one processor to perform the robot positioning method as provided by the above embodiments.
The robot provided by the embodiment of the application is used for executing the robot positioning method provided by the embodiment, and the implementation manner and the principle of the robot positioning method are the same and are not repeated.
The embodiment of the present application provides a computer-readable storage medium, where a computer executing instruction is stored in the computer-readable storage medium, and when a processor executes the computer executing instruction, the robot positioning method provided in any of the above embodiments is implemented.
The storage medium containing the computer-executable instructions of the embodiment of the present application may be used to store the computer-executable instructions of the robot positioning method provided in the foregoing embodiment, and the implementation manner and the principle thereof are the same and are not described again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A robot positioning method, comprising:
acquiring a current laser point cloud and a local probability map corresponding to the current operating environment of the robot;
judging whether laser degradation exists currently according to the matching condition between the current laser point cloud and the local probability map;
when laser degradation exists currently, determining a laser degradation direction according to the position information of a matching point between the current laser point cloud and the local probability map;
acquiring a current predicted position of the robot;
and determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point.
2. The method according to claim 1, wherein determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point comprises:
determining the laser degradation direction as a first direction, and determining a direction perpendicular to the laser degradation direction as a second direction;
determining a first direction coordinate corresponding to the current predicted position as a current first direction coordinate of the robot;
determining the current second direction coordinate of the robot according to the position information of each matching point;
determining the current orientation of the robot according to the angle information of each matching point;
and determining the current pose information according to the current first direction coordinate, the current second direction coordinate and the current orientation of the robot.
3. The method of claim 1, wherein the determining whether laser degradation is currently present according to the matching between the current laser point cloud and the local probability map comprises:
calculating a matching value between the current laser point cloud and the local probability map;
when the matching value is larger than a preset matching value threshold value, determining an acquisition point corresponding to the current laser point cloud as a matching point;
and when the number of the matching points exceeds a first preset threshold value, determining that laser degradation exists currently.
4. The method of claim 3, further comprising:
when the number of the matching points is larger than a second preset threshold and smaller than a first preset threshold, counting the number of the angle matching points corresponding to the current acquisition point;
and when the number of the angle matching points exceeds a third preset threshold value, determining that laser angle degradation exists currently.
5. The method of claim 1, further comprising:
acquiring a current predicted orientation of the robot;
determining the current position information of the robot according to the position information of the matching points;
and determining the current pose information of the robot according to the current predicted orientation and the current position information of the robot.
6. The method of claim 3, wherein calculating a matching value between the current laser point cloud and the local probability map comprises:
the match value Score is determined according to the following formula:
Score = (P(x_1, y_1) + ... + P(x_i, y_i) + ... + P(x_n, y_n)) / n
wherein (x_i, y_i) denotes the grid coordinates of the i-th point in the point cloud, P(x_i, y_i) denotes the occupancy probability of the grid cell containing the i-th point, and n is the number of points in the scan.
7. The method of claim 1, wherein obtaining the current laser point cloud and the local probability map corresponding to the current operating environment of the robot comprises:
acquiring historical point cloud and radar signals of the robot in the current operating environment;
constructing a current laser point cloud according to the radar signal;
and constructing the local probability map according to the historical point cloud.
8. A robot positioning device, comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a current laser point cloud and a local probability map corresponding to the current operating environment of the robot;
the judging module is used for judging whether laser degradation exists currently according to the matching condition between the current laser point cloud and the local probability map;
the determining module is used for determining the laser degradation direction according to the position information of the matching point between the current laser point cloud and the local probability map when the laser degradation exists currently;
the second acquisition module is used for acquiring the current predicted position of the robot;
and the positioning module is used for determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point.
9. A robot, comprising: the system comprises a laser radar, a predicted pose sensor, at least one processor and a memory;
the laser radar carries out laser detection;
the prediction pose sensor is used for acquiring a current prediction position and a current prediction orientation of the robot;
the memory stores computer-executable instructions;
the at least one processor executing the computer-executable instructions stored by the memory causes the at least one processor to perform the method of any one of claims 1-7.
10. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processor, implement the method of any one of claims 1 to 7.
CN202110750179.2A 2021-06-18 2021-07-02 Robot positioning method and device, robot and storage medium Active CN113432533B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110679029 2021-06-18
CN2021106790297 2021-06-18

Publications (2)

Publication Number Publication Date
CN113432533A true CN113432533A (en) 2021-09-24
CN113432533B CN113432533B (en) 2023-08-15

Family

ID=77758885

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110750179.2A Active CN113432533B (en) 2021-06-18 2021-07-02 Robot positioning method and device, robot and storage medium

Country Status (1)

Country Link
CN (1) CN113432533B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113978512A (en) * 2021-11-03 2022-01-28 北京埃福瑞科技有限公司 Rail train positioning method and device
CN114199233A (en) * 2021-11-08 2022-03-18 北京旷视科技有限公司 Pose determination method and movable equipment
CN115267796A (en) * 2022-08-17 2022-11-01 深圳市普渡科技有限公司 Positioning method, positioning device, robot and storage medium
CN117073690A (en) * 2023-10-17 2023-11-17 山东大学 Navigation method and system based on multi-map strategy

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107991683A (en) * 2017-11-08 2018-05-04 华中科技大学 A kind of robot autonomous localization method based on laser radar
CN109579849A (en) * 2019-01-14 2019-04-05 浙江大华技术股份有限公司 Robot localization method, apparatus and robot and computer storage medium
US20190146062A1 (en) * 2017-11-15 2019-05-16 Baidu Online Network Technology (Beijing) Co., Ltd Laser point cloud positioning method and system
CN111077495A (en) * 2019-12-10 2020-04-28 亿嘉和科技股份有限公司 Positioning recovery method based on three-dimensional laser
CN111337011A (en) * 2019-12-10 2020-06-26 亿嘉和科技股份有限公司 Indoor positioning method based on laser and two-dimensional code fusion
CN111443359A (en) * 2020-03-26 2020-07-24 达闼科技成都有限公司 Positioning method, device and equipment
CN111508021A (en) * 2020-03-24 2020-08-07 广州视源电子科技股份有限公司 Pose determination method and device, storage medium and electronic equipment
CN111536964A (en) * 2020-07-09 2020-08-14 浙江大华技术股份有限公司 Robot positioning method and device, and storage medium
CN111582566A (en) * 2020-04-26 2020-08-25 上海高仙自动化科技发展有限公司 Path planning method and planning device, intelligent robot and storage medium
WO2020211655A1 (en) * 2019-04-17 2020-10-22 北京迈格威科技有限公司 Laser coarse registration method, device, mobile terminal and storage medium
CN112123343A (en) * 2020-11-25 2020-12-25 炬星科技(深圳)有限公司 Point cloud matching method, point cloud matching equipment and storage medium
CN112612029A (en) * 2020-12-24 2021-04-06 哈尔滨工业大学芜湖机器人产业技术研究院 Grid map positioning method fusing NDT and ICP
CN112698345A (en) * 2020-12-04 2021-04-23 江苏科技大学 Robot simultaneous positioning and mapping optimization method for laser radar
CN112862874A (en) * 2021-04-23 2021-05-28 腾讯科技(深圳)有限公司 Point cloud data matching method and device, electronic equipment and computer storage medium
WO2021104497A1 (en) * 2019-11-29 2021-06-03 广州视源电子科技股份有限公司 Positioning method and system based on laser radar, and storage medium and processor
CN112904358A (en) * 2021-01-21 2021-06-04 中国人民解放军军事科学院国防科技创新研究院 Laser positioning method based on geometric information
CN112904369A (en) * 2021-01-14 2021-06-04 深圳市杉川致行科技有限公司 Robot repositioning method, device, robot and computer-readable storage medium
CN112923933A (en) * 2019-12-06 2021-06-08 北理慧动(常熟)车辆科技有限公司 Laser radar SLAM algorithm and inertial navigation fusion positioning method

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107991683A (en) * 2017-11-08 2018-05-04 华中科技大学 A kind of robot autonomous localization method based on laser radar
US20190146062A1 (en) * 2017-11-15 2019-05-16 Baidu Online Network Technology (Beijing) Co., Ltd Laser point cloud positioning method and system
CN109579849A (en) * 2019-01-14 2019-04-05 浙江大华技术股份有限公司 Robot localization method, apparatus and robot and computer storage medium
WO2020211655A1 (en) * 2019-04-17 2020-10-22 北京迈格威科技有限公司 Laser coarse registration method, device, mobile terminal and storage medium
WO2021104497A1 (en) * 2019-11-29 2021-06-03 广州视源电子科技股份有限公司 Positioning method and system based on laser radar, and storage medium and processor
CN112923933A (en) * 2019-12-06 2021-06-08 北理慧动(常熟)车辆科技有限公司 Laser radar SLAM algorithm and inertial navigation fusion positioning method
CN111077495A (en) * 2019-12-10 2020-04-28 亿嘉和科技股份有限公司 Positioning recovery method based on three-dimensional laser
CN111337011A (en) * 2019-12-10 2020-06-26 亿嘉和科技股份有限公司 Indoor positioning method based on laser and two-dimensional code fusion
CN111508021A (en) * 2020-03-24 2020-08-07 广州视源电子科技股份有限公司 Pose determination method and device, storage medium and electronic equipment
CN111443359A (en) * 2020-03-26 2020-07-24 达闼科技成都有限公司 Positioning method, device and equipment
CN111582566A (en) * 2020-04-26 2020-08-25 上海高仙自动化科技发展有限公司 Path planning method and planning device, intelligent robot and storage medium
CN111536964A (en) * 2020-07-09 2020-08-14 浙江大华技术股份有限公司 Robot positioning method and device, and storage medium
CN112123343A (en) * 2020-11-25 2020-12-25 炬星科技(深圳)有限公司 Point cloud matching method, point cloud matching equipment and storage medium
CN112698345A (en) * 2020-12-04 2021-04-23 江苏科技大学 Robot simultaneous positioning and mapping optimization method for laser radar
CN112612029A (en) * 2020-12-24 2021-04-06 哈尔滨工业大学芜湖机器人产业技术研究院 Grid map positioning method fusing NDT and ICP
CN112904369A (en) * 2021-01-14 2021-06-04 深圳市杉川致行科技有限公司 Robot repositioning method, device, robot and computer-readable storage medium
CN112904358A (en) * 2021-01-21 2021-06-04 中国人民解放军军事科学院国防科技创新研究院 Laser positioning method based on geometric information
CN112862874A (en) * 2021-04-23 2021-05-28 腾讯科技(深圳)有限公司 Point cloud data matching method and device, electronic equipment and computer storage medium

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
危双丰 (Wei Shuangfeng) et al.: "A survey of simultaneous localization and mapping methods based on lidar", Application Research of Computers (《计算机应用研究》), no. 02, 31 December 2020, pages 13-18 *
吴华 (Wu Hua) et al.: "Mobile robot localization based on incremental landmark appearance learning", Journal of Beijing University of Aeronautics and Astronautics, no. 06, pages 81-85 *
周凯月 (Zhou Kaiyue) et al.: "2D laser SLAM and a high-precision positioning system fusing reflective pillars", Modern Computer (《现代计算机》), no. 11, 30 April 2020, pages 4-8 *
李鑫 (Li Xin) et al.: "A fast ICP-SLAM method based on multi-resolution search and multi-point-cloud density matching", Robot (《机器人》), no. 05, 31 December 2020, pages 73-84 *
王美玲 (Wang Meiling) et al.: "A fast indoor map construction method for autonomous vehicle navigation", Infrared and Laser Engineering, no. 3, pages 265-270 *
纪嘉文 (Ji Jiawen) et al.: "An indoor mapping and localization algorithm based on multi-sensor fusion", Journal of Chengdu University of Information Technology (《成都信息工程大学学报》), no. 04, 31 August 2018, pages 51-58 *
蒋秉川 (Jiang Bingchuan) et al.: "Research on ultra-high-resolution 3D grid navigation map modeling for robots", Journal of System Simulation (《系统仿真学报》), no. 11, 30 November 2017, pages 112-119 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113978512A (en) * 2021-11-03 2022-01-28 北京埃福瑞科技有限公司 Rail train positioning method and device
CN113978512B (en) * 2021-11-03 2023-11-24 北京埃福瑞科技有限公司 Rail train positioning method and device
CN114199233A (en) * 2021-11-08 2022-03-18 北京旷视科技有限公司 Pose determination method and movable equipment
CN114199233B (en) * 2021-11-08 2024-04-05 北京旷视科技有限公司 Pose determining method and movable equipment
CN115267796A (en) * 2022-08-17 2022-11-01 深圳市普渡科技有限公司 Positioning method, positioning device, robot and storage medium
CN115267796B (en) * 2022-08-17 2024-04-09 深圳市普渡科技有限公司 Positioning method, positioning device, robot and storage medium
CN117073690A (en) * 2023-10-17 2023-11-17 山东大学 Navigation method and system based on multi-map strategy
CN117073690B (en) * 2023-10-17 2024-03-15 山东大学 Navigation method and system based on multi-map strategy

Also Published As

Publication number Publication date
CN113432533B (en) 2023-08-15

Similar Documents

Publication Publication Date Title
CN113432533A (en) Robot positioning method and device, robot and storage medium
CN110632921B (en) Robot path planning method and device, electronic equipment and storage medium
US11187790B2 (en) Laser scanning system, laser scanning method, movable laser scanning system, and program
CN109490825B (en) Positioning navigation method, device, equipment, system and storage medium
US20180113234A1 (en) System and method for obstacle detection
CN111258320B (en) Robot obstacle avoidance method and device, robot and readable storage medium
JP2012225806A (en) Road gradient estimation device and program
CN107843252B (en) Navigation path optimization method and device and electronic equipment
CN110674705A (en) Small-sized obstacle detection method and device based on multi-line laser radar
US20200064481A1 (en) Autonomous mobile device, control method and storage medium
CN111308500A (en) Obstacle sensing method and device based on single-line laser radar and computer terminal
CN114815851A (en) Robot following method, robot following device, electronic device, and storage medium
CN111402160A (en) Point cloud data denoising method, device, equipment and storage medium
CN112686951A (en) Method, device, terminal and storage medium for determining robot position
CN113768419B (en) Method and device for determining sweeping direction of sweeper and sweeper
CN113376638A (en) Unmanned logistics trolley environment sensing method and system
CN113625232A (en) Method, device, medium and equipment for suppressing multipath false target in radar detection
CN114488178A (en) Positioning method and device
CN111781606A (en) Novel miniaturization implementation method for fusion of laser radar and ultrasonic radar
CN111780744A (en) Mobile robot hybrid navigation method, equipment and storage device
CN114147707B (en) Robot docking method and device based on visual identification information
CN115507840A (en) Grid map construction method, grid map construction device and electronic equipment
CN112344966B (en) Positioning failure detection method and device, storage medium and electronic equipment
US11267130B2 (en) Robot localization method and apparatus and robot using the same
CN113203424A (en) Multi-sensor data fusion method and device and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant