CN115685236A - Robot, robot skid processing method, device and readable storage medium - Google Patents

Robot, robot skid processing method, device and readable storage medium

Publication number: CN115685236A
Application number: CN202110876987.3A
Authority: CN (China)
Legal status: Pending
Inventor: 朱俊安
Current Assignee: Shenzhen Pudu Technology Co Ltd
Application filed by Shenzhen Pudu Technology Co Ltd
Classification: Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies

Abstract

A robot slip handling method includes: synchronizing first motion data acquired by a wheel odometer with second motion data acquired by a lidar, the first motion data comprising a first translational increment and a first rotational angular velocity increment of the robot, and the second motion data comprising a second translational increment and a second rotational angular velocity increment of the robot; judging whether the robot slips according to a first difference and/or a second difference at the same moment, the first difference being the difference between the first and second translational increments, and the second difference being the difference between the first and second rotational angular velocity increments; and, if the robot slips, re-determining the pose of the robot from the point cloud data acquired by the lidar using a preset algorithm. The technical solution of this application enables a slipping robot to be quickly and accurately relocalized.

Description

Robot, robot slip processing method, device and readable storage medium
Technical Field
The present invention relates to the field of robots, and in particular to a robot, a robot slip handling method, a robot slip handling device, and a readable storage medium.
Background
While a robot is running, it may slip if the ground is wet, a wheel is blocked by a foreign object, the robot drives over a ridge, and so on. Slipping causes the odometer carried by the robot to output erroneous data, so the quality of the robot's mapping and localization degrades or fails outright, and motion-control problems may follow. If the robot can intelligently recognize and handle slipping, the impact of slipping on the localization and mapping system is reduced. However, existing robot slip handling methods still cannot relocalize the robot after slipping has been determined.
Disclosure of Invention
The application provides a robot, a robot slip handling method, a robot slip handling device, and a readable storage medium, to overcome the defect that existing robot slip handling methods cannot relocalize the robot after it slips.
In one aspect, the present application provides a robot carrying a wheeled odometer and a lidar, the robot comprising:
a memory and a processor;
the memory stores executable program code;
the processor, coupled to the memory, invokes executable program code stored in the memory to perform a robot skid handling method comprising:
synchronizing first motion data acquired by the wheeled odometer with second motion data acquired by the lidar, the first motion data including a first translational increment and a first rotational angular velocity increment of the robot, the second motion data including a second translational increment and a second rotational angular velocity increment of the robot;
judging whether the robot slips or not according to a first difference value and/or a second difference value at the same moment, wherein the first difference value is a difference value between the first translation increment and the second translation increment, and the second difference value is a difference value between the first rotation angular velocity increment and the second rotation angular velocity increment;
and if the robot slips, re-determining the current pose of the robot by using a preset algorithm through point cloud data acquired by a laser radar.
In another aspect, the present application provides a robot skid handling apparatus comprising:
a synchronization module to synchronize first motion data acquired by the wheel odometer with second motion data acquired by the lidar, the first motion data including a first translational increment and a first rotational angular velocity increment of the robot, the second motion data including a second translational increment and a second rotational angular velocity increment of the robot;
a judging module, configured to judge whether the robot slips according to a first difference and/or a second difference at the same time, where the first difference is a difference between the first translational increment and a second translational increment, and the second difference is a difference between the first rotational angular velocity increment and the second rotational angular velocity increment;
and the determining module is used for re-determining the current pose of the robot by using a preset algorithm through point cloud data acquired by a laser radar if the robot slips.
In a third aspect, the present application provides a robot skid handling method, the method comprising:
synchronizing first motion data acquired by a wheel odometer with second motion data acquired by a lidar, the first motion data comprising a first translational increment and a first rotational angular velocity increment of the robot, the second motion data comprising a second translational increment and a second rotational angular velocity increment of the robot;
judging whether the robot slips or not according to a first difference value and/or a second difference value at the same moment, wherein the first difference value is a difference value between the first translation increment and the second translation increment, and the second difference value is a difference value between the first rotation angular velocity increment and the second rotation angular velocity increment;
and if the robot slips, re-determining the current pose of the robot by using a preset algorithm through point cloud data acquired by a laser radar.
In a fourth aspect, the present application provides a readable storage medium having stored thereon a computer program for, when executed by a processor, implementing a robot slip handling method, the robot slip handling method being the robot-implemented robot slip handling method described above.
According to the technical solution of the application, after slipping is detected, the pose of the slipping robot is re-determined from the point cloud data acquired by the lidar using a preset algorithm. Because the lidar can acquire sufficient point cloud data, and the preset algorithm subsequently employed is a mature localization technique, the slipping robot can be quickly and accurately relocalized.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of a robot provided in an embodiment of the present application;
fig. 2 is a flowchart of a robot skid handling method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a robot skid handling device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an apparatus provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
In this specification, adjectives such as first and second may only be used to distinguish one element or action from another, without necessarily requiring or implying any actual such relationship or order. References to an element or component or step (etc.) should not be construed as limited to only one of the element, component, or step but rather to one or more of the element, component, or step, etc., where the context permits.
In the present specification, for convenience of description, the sizes of the portions shown in the drawings are not drawn to actual scale.
Referring to fig. 1, a schematic structural diagram of a robot according to an embodiment of the present invention is provided. The robot carries a wheel odometer and a lidar, such as a time-of-flight (ToF) lidar.
For ease of illustration, only portions relevant to the embodiments of the present application are shown. The robot may include:
the robot comprises a memory 10 and a processor 20, wherein the processor 20 is a computing and control core of the robot and is a final execution unit for information processing and program operation. The memory 10 is, for example, a hard disk drive memory, a non-volatile memory (e.g., a flash memory or other electronically programmable deletion-limited memory used to form a solid state drive, etc.), a volatile memory (e.g., a static or dynamic random access memory, etc.), and the like, and the embodiments of the present application are not limited thereto.
The memory 10 has stored therein executable program code; the processor 20, coupled to the memory 10, invokes the executable program code stored in the memory 10 to perform the robot slip handling method of: synchronizing first motion data acquired by the wheel-type odometer with second motion data acquired by the laser radar; judging whether the robot slips or not according to a first difference value and/or a second difference value at the same moment, wherein the first difference value is a difference value between the first translation increment and the second translation increment; if the robot slips, re-determining the pose of the robot by using a preset algorithm through point cloud data acquired by the laser radar; wherein the first motion data comprises a first translational increment and a first rotational angular velocity increment of the robot, the second motion data comprises a second translational increment and a second rotational angular velocity increment of the robot, the first difference is a difference between the first translational increment and the second translational increment, and the second difference is a difference between the first rotational angular velocity increment and the second rotational angular velocity increment.
In an optional embodiment, the first motion data may be obtained directly by the wheel odometer (if it has processing capability), or the wheel odometer may provide multiple frames of raw odometry data which the processor then processes to obtain the first motion data. Similarly, the second motion data may be obtained directly by the lidar (if it has processing capability), or computed from multiple frames of raw lidar data (point cloud data) obtained by the lidar; this is not limited here.
Referring to fig. 2, an embodiment of the present invention provides a robot slip handling method for a robot that carries a wheel odometer and a lidar, for example a time-of-flight (ToF) lidar. The method mainly includes steps S201 to S203, described as follows:
step S201: and synchronizing first motion data acquired by the wheel type odometer with second motion data acquired by the laser radar, wherein the first motion data comprises a first translation increment and a first rotation angular velocity increment of the robot, and the second motion data comprises a second translation increment and a second rotation angular velocity increment of the robot.
In the embodiment of the present application, the first translational increment of the robot comprises the displacement increment dx along the x-axis and the displacement increment dy along the y-axis of the two-dimensional coordinate system, obtained by the wheel odometer; the first rotational angular velocity increment comprises the rotational angular velocity increment dq obtained by the wheel odometer. The second translational increment comprises the displacement increments dx' and dy' along the x-axis and y-axis obtained by the lidar, and the second rotational angular velocity increment comprises the rotational angular velocity increment dq' obtained by the lidar.
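To make these increment quantities concrete, the following minimal Python sketch computes (dx, dy, dq) from two consecutive two-dimensional poses reported by one sensor; the function name and the angle-wrapping convention are illustrative assumptions, since the text does not specify how the increments are derived:

```python
import math

def pose_increment(x0, y0, q0, x1, y1, q1):
    """Per-frame increments (dx, dy, dq) between consecutive 2-D poses.

    An assumption sketch: dq is wrapped to [-pi, pi) so that a heading
    crossing the +/-pi boundary does not produce a spurious large increment.
    """
    dq = (q1 - q0 + math.pi) % (2.0 * math.pi) - math.pi
    return x1 - x0, y1 - y0, dq
```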
Even after hardware synchronization, different sensors sample at different frequencies, so the motion data acquired by the lidar and by the wheel odometer are not aligned on the timestamp. This misalignment obstructs subsequent fusion of the two sensors' data, so the first motion data acquired by the wheel odometer must be synchronized with the second motion data acquired by the lidar. Whether it comes from the wheel odometer or from the lidar, every output frame of data carries a timestamp recording when the data was generated or output. As an embodiment of the present application, synchronizing the first motion data acquired by the wheel odometer with the second motion data acquired by the lidar may proceed as follows: read the timestamp lidar_time of the current frame of second motion data, and the timestamps odom_time(0) of the first frame and odom_time(n) of the last frame among n+1 frames of first motion data, where n is a natural number greater than or equal to 1. If lidar_time lies between odom_time(0) and odom_time(n), read the first motion data odom_data(a) whose timestamp is odom_time(a) and the first motion data odom_data(b) whose timestamp is odom_time(b), where odom_data(a) and odom_data(b) are the two frames of first motion data immediately before and after the current frame of second motion data. From lidar_time, odom_time(a), odom_time(b), odom_data(a), and odom_data(b), compute by an interpolation algorithm the first motion data odom_data(c) whose timestamp is aligned with lidar_time. Considering that linear interpolation is simple, easy to implement, and computationally cheap, this embodiment computes odom_data(c) by linear interpolation, specifically according to the following formula:

odom_data(c) = odom_data(a) + (odom_data(b) - odom_data(a)) * (lidar_time - odom_time(a)) / (odom_time(b) - odom_time(a))

In the above embodiment, if lidar_time falls outside [odom_time(0), odom_time(n)], i.e., lidar_time < odom_time(0) or odom_time(n) < lidar_time, then: when lidar_time < odom_time(0), processing of the current frame of second motion data is abandoned until a frame of second motion data acquired by the lidar has a timestamp between odom_time(0) and odom_time(n), after which processing continues as in the above embodiment; when odom_time(n) < lidar_time, the system waits for the wheel odometer to acquire more first motion data until the timestamps in the wheel odometer's first-motion-data queue again satisfy odom_time(0) < lidar_time < odom_time(n), after which processing continues as in the above embodiment.
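A minimal Python sketch of this synchronization step follows, assuming each frame carries a float timestamp in seconds; the OdomFrame container and the function name are illustrative, not taken from the text:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class OdomFrame:
    t: float   # timestamp (odom_time)
    dx: float  # translational increment along x
    dy: float  # translational increment along y
    dq: float  # rotational angular velocity increment

def sync_odom_to_lidar(odom: List[OdomFrame], lidar_time: float) -> Optional[OdomFrame]:
    """Linearly interpolate odometry to lidar_time.

    Returns None when lidar_time falls outside [odom_time(0), odom_time(n)];
    per the text, the caller then either drops the lidar frame or waits for
    more odometry.
    """
    if not odom or not (odom[0].t <= lidar_time <= odom[-1].t):
        return None
    for a, b in zip(odom, odom[1:]):  # adjacent frames bracketing lidar_time
        if a.t <= lidar_time <= b.t:
            w = (lidar_time - a.t) / (b.t - a.t) if b.t > a.t else 0.0
            return OdomFrame(
                t=lidar_time,
                dx=a.dx + w * (b.dx - a.dx),
                dy=a.dy + w * (b.dy - a.dy),
                dq=a.dq + w * (b.dq - a.dq),
            )
    return None
```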
Step S202: and judging whether the robot slips or not according to a first difference value and/or a second difference value at the same moment, wherein the first difference value is a difference value between a first translation increment and a second translation increment, and the second difference value is a difference value between a first rotation angular velocity increment and a second rotation angular velocity increment.
Here the first rotational angular velocity increment is dq and the second rotational angular velocity increment is dq'. In an embodiment of the present application, judging whether the robot slips according to the first difference and/or the second difference at the same moment may proceed as follows: calculate the first absolute difference fabs-1 between the first and second translational increments at the same moment, and the second absolute difference fabs-2 between the first and second rotational angular velocity increments. If fabs-1 remains greater than a first preset threshold for longer than a first preset time period, and/or fabs-2 remains greater than a second preset threshold for longer than a second preset time period, the robot is determined to be slipping. That is, writing the first preset threshold as D_thre_1 and the second as D_thre_2: if |dx - dx'| = fabs-1 > D_thre_1 or/and |dy - dy'| = fabs-1 > D_thre_1 lasts longer than the first preset time period, and/or |dq - dq'| = fabs-2 > D_thre_2 lasts longer than the second preset time period, the robot is determined to be slipping. Obviously, the larger fabs-1 or fabs-2 is, the more severe the slip.
In another embodiment of the present application, judging whether the robot slips according to the first difference and/or the second difference at the same moment may proceed as follows: calculate the first absolute difference fabs-1 between the first and second translational increments at the same moment, and the second absolute difference fabs-2 between the first and second rotational angular velocity increments. If the number of consecutive frames in which fabs-1 is greater than the first preset threshold D_thre_1 exceeds a first preset frame count, and/or the number of consecutive frames in which fabs-2 is greater than the second preset threshold D_thre_2 exceeds a second preset frame count, the robot is determined to be slipping. That is: if |dx - dx'| = fabs-1 > D_thre_1 or/and |dy - dy'| = fabs-1 > D_thre_1 holds for more than the first preset number of consecutive frames, and/or |dq - dq'| = fabs-2 > D_thre_2 holds for more than the second preset number of consecutive frames, the robot is determined to be slipping. Obviously, the larger fabs-1 or fabs-2 is, the more severe the slip.
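As a sketch, the frame-count embodiment can be realized as below; all threshold and window values are assumed placeholders to be tuned on the target robot, and a duration-based variant would simply track elapsed time instead of a frame counter:

```python
class SlipDetector:
    """Frame-count slip test sketched from the embodiment above."""

    def __init__(self, d_thre_1=0.05, d_thre_2=0.1, n_frames_1=5, n_frames_2=5):
        self.d_thre_1, self.d_thre_2 = d_thre_1, d_thre_2        # assumed thresholds
        self.n_frames_1, self.n_frames_2 = n_frames_1, n_frames_2
        self.count_1 = 0  # consecutive frames with translational disagreement
        self.count_2 = 0  # consecutive frames with rotational disagreement

    def update(self, dx, dy, dq, dx2, dy2, dq2) -> bool:
        fabs_1 = max(abs(dx - dx2), abs(dy - dy2))  # first absolute difference
        fabs_2 = abs(dq - dq2)                      # second absolute difference
        self.count_1 = self.count_1 + 1 if fabs_1 > self.d_thre_1 else 0
        self.count_2 = self.count_2 + 1 if fabs_2 > self.d_thre_2 else 0
        # "and/or": persistent disagreement on either quantity indicates slipping
        return self.count_1 > self.n_frames_1 or self.count_2 > self.n_frames_2
```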
Step S203: and if the robot slips, the current pose of the robot is determined again by the point cloud data acquired by the laser radar through a preset algorithm.
When the robot slips, the robot pose calculated from the sensing data output by the odometer may be erroneous, leading to a series of later mapping and localization errors, so the pose must be re-determined. In the embodiment of the application, the pose is re-determined using a particle filter algorithm and the point cloud data acquired by the lidar. Note that in this embodiment the point cloud data acquired by the lidar may include both the current point cloud data and earlier historical point cloud data.
In one embodiment of the application, re-determining the current pose of the robot from the point cloud data acquired by the lidar using a preset algorithm may be implemented through the following steps Sa2031 to Sa2035:

Step Sa2031: acquiring the current point cloud data obtained by lidar scanning and the first point cloud data from when the robot started to slip.
When the robot is at its current position, the lidar scans the surrounding environment; the laser beams strike obstacles in the environment and return, and the azimuth information of the obstacles is computed from the round trip of the beams, forming the point cloud information.
In an optional embodiment, slipping must persist for a preset time period or a preset number of frames before it is confirmed, so the pose at the moment slipping is confirmed already carries some inaccuracy, whereas the pose at the moment slipping began is still relatively accurate. The current pose therefore needs to be corrected based on the pose at the onset of slipping.
Specifically, current point cloud data obtained by scanning of the laser radar and first point cloud data when the robot starts to slip are obtained.
Because the wheel odometer and the lidar run at different frequencies, the current point cloud data is the point cloud data closest in time to the node at which slipping was confirmed, and possibly the point cloud data exactly at that node. Likewise, the first point cloud data is the point cloud data closest in time to the node at which slipping began, and possibly the point cloud data exactly at that node.
Step Sa2032: calculating the displacement data of the robot based on the first point cloud data and the current point cloud data.
The displacement data of the robot are then calculated by comparing the first point cloud data with the current point cloud data; that is, the displacement increment of the robot is computed, comprising a position increment and an orientation-angle increment.
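The text does not name a specific matcher for this comparison, so the sketch below uses a minimal 2-D point-to-point ICP as one possible stand-in; the iteration count and the use of scipy's cKDTree for nearest neighbours are assumptions:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_2d(source: np.ndarray, target: np.ndarray, iters: int = 30):
    """Align source (N,2) onto target (M,2); return (dx, dy, dq) of the fit."""
    R, t = np.eye(2), np.zeros(2)
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iters):
        _, idx = tree.query(src)            # nearest target point per source point
        tgt = target[idx]
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        H = (src - mu_s).T @ (tgt - mu_t)   # 2x2 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R_step = Vt.T @ U.T                 # Kabsch rotation estimate
        if np.linalg.det(R_step) < 0:       # guard against reflections
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = mu_t - R_step @ mu_s
        src = src @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    dq = float(np.arctan2(R[1, 0], R[0, 0]))  # orientation-angle increment
    return float(t[0]), float(t[1]), dq       # position and orientation increments
```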
Step Sa2033: acquiring a predicted particle swarm of the robot using the displacement data and the first particle swarm from when the robot started to slip, the predicted particle swarm comprising a plurality of target sampling particles.
And then, acquiring a predicted particle swarm of the robot by utilizing the displacement data and the first particle swarm when the robot starts to slip.
In an alternative embodiment, when the robot is localized, a plurality of possible particles are sampled to form a particle swarm, each particle corresponding to one hypothesis of the robot's pose. Generally a preset number of particles, for example 6 to 10, exists in one particle swarm, and the number may fluctuate within that range. When determining the pose, the particle with the highest confidence is selected from the swarm and its pose information is taken as the robot's pose information; alternatively, the average of the pose information of the several highest-confidence particles may be taken as the robot's pose information.
In an alternative embodiment, the first particle swarm at the onset of slipping is the particle swarm corresponding to the time node at which the robot started to slip, and may also be the particle swarm corresponding to the first point cloud data.
Subsequently, the first population of particles is processed with the displacement data to obtain a predicted population of particles for the robot, the predicted population of particles including the plurality of target sampling particles.
Specifically, since the displacement data consist of the robot's position increment and orientation-angle increment, the first particle swarm is advanced by the displacement data to determine a plurality of predicted particles; that is, each first particle in the first particle swarm is propagated using the displacement data to obtain a corresponding predicted particle.
Then, a plurality of target sampling particles are randomly distributed around the plurality of predicted particles using a Gaussian model, such that the target sampling particles fall into the unoccupied area of the grid map, i.e., into its blank area, rather than into an occupied area (an area where an obstacle exists). These target sampling particles form the predicted particle swarm.
Optionally, the plurality of target sampling particles randomly distributed around each predicted particle follow a Gaussian distribution, and one of them may coincide with the original predicted particle; the target sampling particles constitute the predicted particle swarm.
The sampling particles can be calculated by the following formulas:

x_t = x_{t-1} + dx_{t-1} + e_x
y_t = y_{t-1} + dy_{t-1} + e_y
q_t = q_{t-1} + dq_{t-1} + e_q

where x_t and x_{t-1} are the horizontal-axis coordinates of the robot in the world coordinate system at time t and time t-1 respectively, y_t and y_{t-1} are the vertical-axis coordinates at time t and time t-1, q_t and q_{t-1} are the orientations of the robot at time t and time t-1, dx_{t-1}, dy_{t-1}, and dq_{t-1} are the displacement data (i.e., the position increments and orientation-angle increment), and e_x, e_y, and e_q are Gaussian noise terms.
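A sketch of the prediction step under these formulas follows, assuming noise standard deviations, a samples-per-particle count, and a grid_is_free(x, y) map lookup that are not fixed by the text:

```python
import numpy as np

rng = np.random.default_rng()

def predict_particles(first_particles, dx, dy, dq, grid_is_free,
                      n_samples=8, sigma_xy=0.05, sigma_q=0.02):
    """first_particles: iterable of (x, y, q). Returns target sampling particles."""
    targets = []
    for x, y, q in first_particles:
        for _ in range(n_samples):
            # x_t = x_{t-1} + dx_{t-1} + e_x, and likewise for y and q
            xt = x + dx + rng.normal(0.0, sigma_xy)
            yt = y + dy + rng.normal(0.0, sigma_xy)
            qt = q + dq + rng.normal(0.0, sigma_q)
            if grid_is_free(xt, yt):  # keep only samples in the unoccupied area
                targets.append((xt, yt, qt))
    return targets
```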
Step Sa2034: performing confidence evaluation on the plurality of target sampling particles according to the current point cloud data to obtain the confidences of the plurality of target sampling particles.
The confidence of a particle is the degree of matching between the current point cloud data and the point cloud information derived from that particle and the grid map. That is, each particle has a corresponding pose, so the point cloud information observed from that pose, i.e., what the robot would see if it were located at that pose, can be obtained from the pose and the grid map.

This point cloud information is then compared with the robot's current point cloud data, i.e., the point cloud actually observed. The higher the matching degree, the closer the particle's pose is to the robot's actual pose, and hence the higher the particle's confidence.
Through the method, the matching degree of each particle can be evaluated, so that the confidence of the particle is obtained.
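One simple realization of this evaluation, sketched below, transforms the current scan's endpoints into the map frame at a particle's pose and scores the fraction landing on occupied cells; grid_is_occupied(x, y) is an assumed map interface, and a likelihood-field score would be a common refinement:

```python
import numpy as np

def particle_confidence(particle, scan_xy, grid_is_occupied):
    """particle: (x, y, q); scan_xy: (N, 2) beam endpoints in the robot frame."""
    x, y, q = particle
    c, s = np.cos(q), np.sin(q)
    R = np.array([[c, -s], [s, c]])
    pts = scan_xy @ R.T + np.array([x, y])     # scan as seen from this pose
    hits = sum(1 for px, py in pts if grid_is_occupied(px, py))
    return hits / max(len(pts), 1)             # matching degree in [0, 1]
```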
Step Sa2035: acquiring a current particle swarm and outputting the current target pose according to the current possible poses and confidences corresponding to the plurality of target sampling particles.
Step Sa2035 may select, from the plurality of target sampling particles, those whose confidence exceeds a threshold, or sort the target sampling particles in descending order of confidence and select a preset number of them, to form the current particle swarm.

That is, the target sampling particles whose confidence is greater than the threshold are kept to form the current particle swarm and the remaining sampling particles are discarded; or the target sampling particles are sorted by descending confidence and a preset number of the highest-confidence particles form the current particle swarm. The specific method is not limited here.
Then, the current target pose is determined either as the average of the poses corresponding to all target sampling particles in the current particle swarm, or as the pose corresponding to the highest-confidence target sampling particle in the current particle swarm.
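Both output variants can be sketched together as below; top_k and the circular mean used for the heading average are illustrative choices not fixed by the text:

```python
import numpy as np

def output_pose(particles, confidences, top_k=5, use_mean=True):
    """particles: (K, 3) array-like of (x, y, q); confidences: length-K array-like."""
    order = np.argsort(confidences)[::-1][:top_k]  # descending confidence
    kept = np.asarray(particles, dtype=float)[order]
    if not use_mean:
        return tuple(kept[0])                      # highest-confidence pose
    x, y = kept[:, 0].mean(), kept[:, 1].mean()
    # circular mean keeps the heading average well defined near +/-pi
    q = np.arctan2(np.sin(kept[:, 2]).mean(), np.cos(kept[:, 2]).mean())
    return float(x), float(y), float(q)
```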
As can be seen from the robot slip handling method illustrated in fig. 2, after slipping is detected, the pose of the slipping robot is re-determined using a particle filter algorithm and the point cloud data acquired by the lidar. Because the lidar can acquire sufficient point cloud data and the preset algorithm subsequently employed (such as the particle filter algorithm) is a mature localization technique, the slipping robot can be quickly and accurately relocalized. Optionally, the point cloud data together with the raw odometry data acquired by the odometer are used to determine whether the robot slips, and the same point cloud data are reused to relocalize the robot, which effectively reduces the number of sensors required.
Referring to fig. 3, the robot slip handling device according to the embodiment of the present application may be the central processing unit of a robot, or a functional module thereof; the robot carries a wheel odometer and a lidar. The device may include a synchronization module 301, a judgment module 302, and a determination module 303, described in detail as follows:
a synchronization module 301, configured to synchronize first motion data acquired by the wheel type odometer with second motion data acquired by the laser radar, where the first motion data includes a first translational increment and a first rotational angular velocity increment of the robot, and the second motion data includes a second translational increment and a second rotational angular velocity increment of the robot;
a judgment module 302, configured to judge whether the robot slips according to a first difference and/or a second difference at the same moment, where the first difference is the difference between the first and second translational increments, and the second difference is the difference between the first and second rotational angular velocity increments;
and the determining module 303 is configured to re-determine the pose of the robot by using a preset algorithm through the point cloud data acquired by the laser radar if the robot slips.
For the principle of each module, please refer to the content corresponding to the above embodiment, which is not described herein again.
As can be seen from the device illustrated in fig. 3, after the robot is judged to be slipping, the pose of the slipping robot is re-determined using a particle filter algorithm and the point cloud data acquired by the lidar. Since the lidar can acquire sufficient point cloud data and the particle filter algorithm is a mature localization technique, the slipping robot can be quickly and accurately relocalized.
Fig. 4 is a schematic structural diagram of an apparatus provided in an embodiment of the present application. As shown in fig. 4, the apparatus 4 of this embodiment may be a robot or a module therein, and mainly includes: a processor 40, a memory 41, and a computer program 42 stored in the memory 41 and executable on the processor 40, such as a program of a robot slip handling method. The processor 40, when executing the computer program 42, implements the steps in the above robot slip handling method embodiment, such as steps S201 to S203 shown in fig. 2. Alternatively, the processor 40, when executing the computer program 42, implements the functions of the modules/units in the above device embodiments, such as the functions of the synchronization module 301, the judgment module 302, and the determination module 303 shown in fig. 3.
Illustratively, the computer program 42 of the robot slip handling method mainly comprises: synchronizing first motion data acquired by the wheel odometer with second motion data acquired by the lidar, the first motion data comprising a first translational increment and a first rotational angular velocity increment of the robot, and the second motion data comprising a second translational increment and a second rotational angular velocity increment of the robot; judging whether the robot slips according to a first difference and/or a second difference at the same moment, the first difference being the difference between the first and second translational increments, and the second difference being the difference between the first and second rotational angular velocity increments; and, if the robot slips, re-determining the pose of the robot from the point cloud data acquired by the lidar using a preset algorithm. The computer program 42 may be divided into one or more modules/units, which are stored in the memory 41 and executed by the processor 40 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments describing the execution of the computer program 42 in the device 4. For example, the computer program 42 may be divided into the synchronization module 301, the judgment module 302, and the determination module 303 (modules in a virtual device), whose specific functions are as follows: the synchronization module 301 synchronizes the first motion data acquired by the wheel odometer with the second motion data acquired by the lidar; the judgment module 302 judges whether the robot slips according to the first difference and/or the second difference at the same moment; and the determination module 303 re-determines the pose of the robot from the point cloud data acquired by the lidar using a preset algorithm if the robot slips.
The device 4 may include, but is not limited to, a processor 40, a memory 41. Those skilled in the art will appreciate that fig. 4 is merely an example of a device 4 and does not constitute a limitation of device 4 and may include more or fewer components than shown, or some components in combination, or different components, e.g., a computing device may also include input-output devices, network access devices, buses, etc.
The processor 40 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 41 may be an internal storage unit of the device 4, such as a hard disk or memory of the device 4. The memory 41 may also be an external storage device of the device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the device 4. Further, the memory 41 may include both an internal storage unit and an external storage device of the device 4. The memory 41 is used to store the computer program and other programs and data required by the device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as required to different functional units and modules, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only used for distinguishing one functional unit from another, and are not used for limiting the protection scope of the present application. For the specific working processes of the units and modules in the above-mentioned apparatus, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other ways. For example, the above-described apparatus/device embodiments are merely illustrative, and for example, a module or a unit may be divided into only one logic function, and may be implemented in other ways, for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a non-transitory computer-readable storage medium. Based on this understanding, all or part of the flow of the methods of the embodiments described above may be implemented by a computer program. The computer program may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments above: synchronizing first motion data acquired by the wheel odometer with second motion data acquired by the lidar, the first motion data comprising a first translational increment and a first rotational angular velocity increment of the robot, and the second motion data comprising a second translational increment and a second rotational angular velocity increment of the robot; judging whether the robot slips according to a first difference and/or a second difference at the same moment, the first difference being the difference between the first and second translational increments, and the second difference being the difference between the first and second rotational angular velocity increments; and, if the robot slips, re-determining the pose of the robot by particle filtering from the point cloud data acquired by the lidar.

The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The non-transitory computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the non-transitory computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, the non-transitory computer-readable medium does not include electrical carrier signals and telecommunications signals.

The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
The objects, technical solutions, and advantages of the present application have been described in further detail in the above embodiments. It should be understood that the above are only specific embodiments of the present application and are not intended to limit its scope of protection; any modification, equivalent substitution, improvement, and the like made within the spirit and principles of the present application shall be included in the scope of protection of the present application.

Claims (11)

1. A robot carrying a wheel-type odometer and a lidar, the robot comprising:
a memory and a processor;
the memory stores executable program code;
the processor, coupled to the memory, invokes executable program code stored in the memory to perform a robot skid handling method comprising:
synchronizing first motion data acquired by the wheeled odometer with second motion data acquired by the lidar, the first motion data including a first translational increment and a first rotational angular velocity increment of the robot, the second motion data including a second translational increment and a second rotational angular velocity increment of the robot;
judging whether the robot slips or not according to a first difference value and/or a second difference value at the same moment, wherein the first difference value is a difference value between the first translation increment and the second translation increment, and the second difference value is a difference value between the first rotation angular velocity increment and the second rotation angular velocity increment;
and if the robot slips, re-determining the current pose of the robot by using a preset algorithm through point cloud data acquired by a laser radar.
2. The robot as set forth in claim 1, wherein the processor invokes the executable program code stored in the memory, and the step of synchronizing the first motion data acquired by the wheeled odometer with the second motion data acquired by the lidar comprises:

reading the timestamp lidar_time of the current frame of the second motion data, and the timestamp odom_time(0) of the first frame and the timestamp odom_time(n) of the last frame of n+1 frames of the first motion data, n being a natural number greater than or equal to 1;

if lidar_time lies between odom_time(0) and odom_time(n), reading the first motion data odom_data(a) whose timestamp is odom_time(a) and the first motion data odom_data(b) whose timestamp is odom_time(b), the first motion data odom_data(a) and odom_data(b) being the two frames of first motion data immediately before and after the current frame of the second motion data;

calculating, by an interpolation algorithm according to the timestamp lidar_time, the timestamp odom_time(a), the timestamp odom_time(b), the first motion data odom_data(a), and the first motion data odom_data(b), the first motion data odom_data(c) whose timestamp is aligned with lidar_time.
3. The robot of claim 1, wherein said processor calls executable program code stored in said memory to perform the step of determining whether said robot is slipping based on a first difference and/or a second difference at the same time comprising:
calculating a first absolute difference fabs-1 between the first and second translational increments and a second absolute difference fabs-2 between the first and second rotational angular velocity increments at the same time;
and if the duration that the first absolute difference value fabs-1 is greater than a first preset threshold value exceeds a first preset time period and/or the duration that the second absolute difference value fabs-2 is greater than a second preset threshold value exceeds a second preset time period, determining that the robot slips.
4. The robot of claim 1, wherein said processor calls executable program code stored in said memory to perform the step of determining whether said robot is slipping based on a first difference and/or a second difference at the same time comprising:
calculating a first absolute difference fabs-1 between the first and second translational increments and a second absolute difference fabs-2 between the first and second rotational angular velocity increments at the same time;
and if the multi-frame data continuous frame number of which the first absolute difference fabs-1 is greater than a first preset threshold exceeds a first preset frame number and/or the multi-frame data continuous frame number of which the second absolute difference fabs-2 is greater than a second preset threshold exceeds a second preset frame number, determining that the robot slips.
5. The robot of claim 1, wherein the processor invokes executable program code stored in the memory, the executed step of re-determining the current pose of the robot using a preset algorithm from point cloud data acquired by lidar comprising:
acquiring current point cloud data obtained by scanning of the laser radar and first point cloud data when the robot starts to slip;
calculating displacement data of the robot based on the first point cloud data and the current point cloud data;
acquiring a prediction particle swarm of the robot by using the displacement data and a first particle swarm when the robot starts to slip, wherein the prediction particle swarm comprises a plurality of target sampling particles;
performing confidence evaluation on a plurality of target sampling particles according to the current point cloud data to obtain confidence of the plurality of target sampling particles;
and acquiring a current particle swarm and outputting the current target pose according to the current possible poses and the confidence degrees corresponding to the target sampling particles.
6. The robot of claim 5, wherein the first particle swarm comprises a plurality of first particles, and the processor invokes executable program code stored in the memory to perform the step of obtaining the predicted particle swarm of the robot using the displacement data and the first particle swarm from when the robot started to slip, the step comprising:

calculating each first particle based on the displacement data, thereby determining a plurality of predicted particles;
and utilizing a Gaussian model to randomly distribute a plurality of target sampling particles around the plurality of predicted particles so that the plurality of target sampling particles fall into the unoccupied area of the grid map.
7. The robot of claim 5, wherein said processor invokes executable program code stored in said memory to perform said steps of acquiring a current particle population and outputting said current target pose according to current possible poses and confidences corresponding to said plurality of target sample particles, comprising:
selecting a plurality of target sampling particles with confidence degrees larger than a threshold value from the plurality of target sampling particles or selecting a preset number of target sampling particles from the plurality of target sampling particles in descending order according to the confidence degrees to form the current particle swarm;
and determining the current target pose based on the average value of the poses corresponding to all the target sampling particles in the current particle swarm or determining the pose corresponding to the target sampling particle with the highest confidence level in all the target sampling particles in the current particle swarm as the current target pose.
8. The robot of claim 1, wherein the processor invokes executable program code stored in the memory, and the step of re-determining the current pose of the robot from the point cloud data acquired by the lidar using a preset algorithm comprises:
acquiring the current pose of the robot based on point cloud data acquired by the laser radar, wherein the point cloud data comprises current sensor observation information of the robot;
performing map matching processing based on the current sensor observation information of the robot and the current pose of the robot to obtain map information;
performing particle swarm suggestion distribution calculation based on the robot pose at the last moment, the current sensor observation information and the map information to obtain a particle swarm suggestion distribution probability value;
resampling the particles of the particle swarm according to their weight values based on the suggested distribution probability value, to obtain a resampled particle swarm;
screening particles according to the weight of the particles of the resampled particle swarm, copying high-weight particles, discarding low-weight particles, and comparing and updating the maximum weight of the particle swarm;
and detecting whether the weight of the particle with the maximum weight in the particle swarm is larger than a preset weight threshold value or not, and if so, obtaining the optimal pose at the moment.
9. A robotic skid management apparatus, the apparatus comprising:
a synchronization module for synchronizing first motion data acquired by a wheeled odometer with second motion data acquired by a lidar, the first motion data including a first translational increment and a first rotational angular velocity increment of the robot, the second motion data including a second translational increment and a second rotational angular velocity increment of the robot;
a judging module, configured to judge whether the robot slips according to a first difference and/or a second difference at the same time, where the first difference is a difference between the first translational increment and a second translational increment, and the second difference is a difference between the first rotational angular velocity increment and the second rotational angular velocity increment;
and the determining module is used for re-determining the current pose of the robot by using a preset algorithm through the point cloud data acquired by the laser radar if the robot slips.
10. A method of robot skid handling, the method comprising:
synchronizing first motion data acquired by a wheel odometer with second motion data acquired by a lidar, the first motion data comprising a first translational increment and a first rotational angular velocity increment of the robot, the second motion data comprising a second translational increment and a second rotational angular velocity increment of the robot;
judging whether the robot slips or not according to a first difference value and/or a second difference value at the same moment, wherein the first difference value is a difference value between the first translation increment and the second translation increment, and the second difference value is a difference value between the first rotation angular velocity increment and the second rotation angular velocity increment;
and if the robot slips, re-determining the pose of the robot by using a preset algorithm through the point cloud data acquired by the laser radar.
11. A readable storage medium having stored thereon a computer program which, when executed by a processor, implements a robot slip handling method, the robot slip handling method being the robot slip handling method implemented by the robot of any one of claims 1 to 8.
CN202110876987.3A 2021-07-31 2021-07-31 Robot, robot skid processing method, device and readable storage medium Pending CN115685236A (en)

Priority Applications (1)

Application Number: CN202110876987.3A; Priority/Filing Date: 2021-07-31; Title: Robot, robot skid processing method, device and readable storage medium
Publications (1)

Publication Number: CN115685236A; Publication Date: 2023-02-03

Family ID: 85060151

Family Applications (1)

Application Number: CN202110876987.3A; Filing Date: 2021-07-31; Status: Pending

Country Status (1): CN; CN115685236A (en)


Legal Events

Code: PB01; Title: Publication
Code: SE01; Title: Entry into force of request for substantive examination