CN115683093A - Robot, robot skid processing method, device and readable storage medium - Google Patents

Robot, robot skid processing method, device and readable storage medium

Info

Publication number
CN115683093A
CN115683093A
Authority
CN
China
Prior art keywords
robot
motion data
current
angular velocity
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110878199.8A
Other languages
Chinese (zh)
Inventor
朱俊安
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Pudu Technology Co Ltd
Original Assignee
Shenzhen Pudu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Pudu Technology Co Ltd filed Critical Shenzhen Pudu Technology Co Ltd
Priority to CN202110878199.8A
Publication of CN115683093A
Legal status: Pending

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A robot skid handling method comprises: synchronizing first motion data acquired by a wheel odometer with second motion data acquired by an inertial measurement unit, the first motion data including a first rotational angular velocity of the robot and the second motion data including a second rotational angular velocity of the robot; judging whether the robot slips according to the difference between the first rotational angular velocity and the second rotational angular velocity at the same moment; and, if the robot slips, re-determining the pose of the robot from point cloud data acquired by a laser radar using a preset algorithm. The technical solution of the present application keeps the cost of the whole robot relatively low and provides a real-time guarantee for re-determining the pose of a slipping robot.

Description

Robot, robot skid processing method, device and readable storage medium
Technical Field
The invention relates to the field of robots, in particular to a robot, a robot skid processing method, a robot skid processing device and a readable storage medium.
Background
During operation, a robot may slip if the ground is wet, a wheel is blocked by a foreign object, the robot crosses a ridge, or the like. Slipping causes the odometer carried by the robot to output erroneous data, which degrades, or even corrupts, mapping and localization quality and can cause problems in robot motion control. If the robot can intelligently recognize and handle slipping, the impact of slipping on the localization and mapping system is reduced. Existing robot slip processing methods suffer from drawbacks such as high computation time and poor real-time performance, and are therefore difficult to apply in practical scenarios.
Disclosure of Invention
The present application provides a robot, a robot skid processing method, a device, and a readable storage medium, to overcome the drawbacks of existing robot slip processing methods, such as high computation time and poor real-time performance.
In one aspect, the present application provides a robot carrying a wheeled odometer and an inertial measurement unit, the robot comprising:
a memory and a processor;
the memory stores executable program code;
the processor, coupled to the memory, invokes the executable program code stored in the memory to perform a robot skid handling method comprising:
synchronizing first motion data acquired by the wheel odometer with second motion data acquired by the inertial measurement unit, the first motion data including a first angular velocity of rotation of the robot, the second motion data including a second angular velocity of rotation of the robot;
judging whether the robot slips or not according to the difference value between the first rotation angular velocity and the second rotation angular velocity at the same moment;
and if the robot slips, re-determining the current pose of the robot by using a preset algorithm through point cloud data acquired by a laser radar.
In another aspect, the present application provides a robot skid handling apparatus comprising:
a synchronization module for synchronizing first motion data acquired by a wheel odometer with second motion data acquired by an inertial measurement unit, the first motion data including a first angular velocity of rotation of the robot, the second motion data including a second angular velocity of rotation of the robot;
the judging module is used for judging whether the robot slips or not according to the difference value between the first rotation angular velocity and the second rotation angular velocity at the same moment;
and the determining module is used for re-determining the current pose of the robot by using a preset algorithm through the point cloud data acquired by the laser radar if the robot slips.
In a third aspect, the present application provides a robot skid handling method, the method comprising:
synchronizing first motion data acquired by a wheel odometer with second motion data acquired by an inertial measurement unit, the first motion data comprising a first angular velocity of rotation of the robot, the second motion data comprising a second angular velocity of rotation of the robot;
judging whether the robot slips or not according to the difference value between the first rotation angular velocity and the second rotation angular velocity at the same moment;
and if the robot slips, re-determining the current pose of the robot by using a preset algorithm through point cloud data acquired by a laser radar.
In a fourth aspect, the present application provides a readable storage medium having stored thereon a computer program which, when executed by a processor, implements a robot slip handling method, namely the robot slip handling method performed by the robot described above.
According to the technical solution provided by the present application, on the one hand, the wheel odometer and the inertial measurement unit are inexpensive compared with sensors such as a laser radar, so using them to acquire the first motion data and the second motion data keeps the cost of the whole robot relatively low; on the other hand, the volume of the first and second motion data acquired by the wheel odometer and the inertial measurement unit is small compared with the data acquired by sensors such as a laser radar, so judging whether the robot slips from these data takes little time, which provides a real-time guarantee for subsequently re-determining the pose of the slipping robot with the preset algorithm.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of a robot provided in an embodiment of the present application;
fig. 2 is a flowchart of a robot skid handling method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a robot skid handling device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an apparatus provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without creative effort fall within the protection scope of the present application.
In this specification, adjectives such as first and second may only be used to distinguish one element or action from another, without necessarily requiring or implying any actual such relationship or order. References to an element or component or step (etc.) should not be construed as limited to only one of the element, component, or step, but rather to one or more of the element, component, or step, etc., where the context permits.
In the present specification, the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
Referring to fig. 1, a schematic structural diagram of a robot according to an embodiment of the present application is provided, where the robot is equipped with a wheel-type odometer and an inertia measurement unit. For ease of illustration, only portions relevant to the embodiments of the present application are shown. The robot may include:
the memory 10 and the processor 20. The processor 20 is the operation and control core of the robot and the final execution unit for information processing and program running. The memory 10 is, for example, a hard disk drive memory, a non-volatile memory (e.g., a flash memory or other electrically erasable programmable memory used to form a solid-state drive), a volatile memory (e.g., a static or dynamic random access memory), and the like; the embodiments of the present application are not limited in this respect.
The memory 10 stores executable program code; the processor 20, coupled to the memory 10, calls the executable program code stored in the memory 10 to perform the following robot slip handling method: synchronizing first motion data acquired by the wheel odometer with second motion data acquired by the inertial measurement unit, where the first motion data includes a first rotational angular velocity of the robot and the second motion data includes a second rotational angular velocity of the robot; judging whether the robot slips according to the difference between the first rotational angular velocity and the second rotational angular velocity at the same moment; and, if the robot slips, re-determining the current pose of the robot from point cloud data acquired by a laser radar using a preset algorithm.
In alternative embodiments, the first motion data may be acquired directly by the wheel odometer (if it has a processing function), or the wheel odometer may acquire raw odometer data that the processor then further computes and processes to obtain the first motion data. Similarly, the second motion data may be acquired directly by the inertial measurement unit, or the inertial measurement unit may acquire raw inertial data that the processor then further computes and processes to obtain the second motion data.
Referring to fig. 2, a robot skid handling method according to an embodiment of the present application mainly includes steps S201 to S203, which are described as follows:
step S201: and synchronizing first motion data acquired by the wheel type odometer with second motion data acquired by the inertial measurement unit, wherein the first motion data comprises a first rotation angular velocity of the robot, and the second motion data comprises a second rotation angular velocity of the robot.
Because different sensors sample at different frequencies, even after hardware synchronization the motion data acquired by the inertial measurement unit and by the wheel odometer are not synchronized on their timestamps, and this misalignment hinders the subsequent fusion of the two sensors' data; the first motion data acquired by the wheel odometer and the second motion data acquired by the inertial measurement unit therefore need to be synchronized. Whether from the wheel odometer or from the inertial measurement unit, every output frame of data carries a timestamp recording when the data was generated or output. As an embodiment of the present application, synchronizing the first motion data acquired by the wheel odometer with the second motion data acquired by the inertial measurement unit may proceed as follows: read the timestamp imu_time of the current frame of the second motion data, and the timestamp odom_time(0) of the first frame and the timestamp odom_time(n) of the last frame among the n+1 frames of the first motion data; if imu_time lies between odom_time(0) and odom_time(n), read the first motion data odom_data(a) whose timestamp is odom_time(a) and the first motion data odom_data(b) whose timestamp is odom_time(b); then, from imu_time, odom_time(a), odom_time(b), odom_data(a) and odom_data(b), compute by an interpolation algorithm the first motion data odom_data(c) whose timestamp is aligned with imu_time. Here n is a natural number greater than or equal to 1, and odom_data(a) and odom_data(b) are the two adjacent frames of first motion data immediately before and after the current frame of the second motion data. Considering that linear interpolation is simple to implement and computationally cheap, the interpolation algorithm in the above embodiment may be a linear interpolation algorithm; specifically, the first motion data odom_data(c) aligned with imu_time is computed according to the following formula:
odom_data(c) = odom_data(a) + (imu_time − odom_time(a)) / (odom_time(b) − odom_time(a)) × (odom_data(b) − odom_data(a))
In the above embodiment, if imu_time falls outside the interval between odom_time(0) and odom_time(n), i.e. imu_time < odom_time(0) or odom_time(n) < imu_time, the handling is as follows: when imu_time < odom_time(0), processing of the current frame of the second motion data is abandoned until a frame of second motion data acquired by the inertial measurement unit carries a timestamp lying between odom_time(0) and odom_time(n), and that frame is processed as in the above embodiment; when odom_time(n) < imu_time, the method waits for the wheel odometer to acquire further first motion data until the timestamps in the wheel odometer's first motion data queue again satisfy odom_time(0) < imu_time < odom_time(n), and processing then continues as described in the above embodiment.
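For illustration only (not part of the claimed method as filed), the following is a minimal Python sketch of this interpolation-based alignment; the function name and queue layout are assumptions:

```python
from bisect import bisect_left

def interpolate_odom(imu_time, odom_times, odom_data):
    """Align wheel-odometer motion data to an IMU timestamp by linear
    interpolation, as in the formula above.

    odom_times: sorted timestamps odom_time(0) .. odom_time(n)
    odom_data:  motion data (e.g. rotational angular velocity) per frame
    Returns odom_data(c) aligned with imu_time, or None when imu_time falls
    outside the queue (the caller then drops the IMU frame or waits for more
    odometer data, matching the two fall-back cases described above).
    """
    if imu_time < odom_times[0] or imu_time > odom_times[-1]:
        return None
    b = bisect_left(odom_times, imu_time)   # first frame at or after imu_time
    if odom_times[b] == imu_time:
        return odom_data[b]                 # exact match, no interpolation
    a = b - 1                               # frame just before imu_time
    t_a, t_b = odom_times[a], odom_times[b]
    d_a, d_b = odom_data[a], odom_data[b]
    # odom_data(c) = d_a + (imu_time - t_a) / (t_b - t_a) * (d_b - d_a)
    return d_a + (imu_time - t_a) / (t_b - t_a) * (d_b - d_a)
```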
Step S202: judge whether the robot slips according to the difference between the first rotational angular velocity and the second rotational angular velocity at the same moment.
Here, denote the first rotational angular velocity by AngleVelOdom and the second rotational angular velocity by AngleVelImu. In the embodiment of the present application, judging whether the robot slips according to the difference between the first and second rotational angular velocities at the same moment may proceed as follows: compute the absolute difference fabs = |AngleVelOdom − AngleVelImu| between the two angular velocities at the same moment; if fabs stays greater than a preset threshold D_thre for longer than a preset time period, or if the number of consecutive frames of data in which fabs > D_thre exceeds a preset number of frames, determine that the robot is slipping. Obviously, the larger fabs is, the more severely the robot slips.
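A hedged Python sketch of this judgment using the consecutive-frame criterion; the values of D_thre and the frame count are illustrative, not values given in the application:

```python
def update_slip_state(angle_vel_odom, angle_vel_imu, state,
                      d_thre=0.2, max_frames=10):
    """Per-frame slip check using the consecutive-frame criterion: fabs must
    exceed D_thre for more than max_frames consecutive frames (the duration
    criterion works the same way with a time window instead of a counter).

    state is a dict holding the consecutive-frame counter between calls.
    """
    fabs = abs(angle_vel_odom - angle_vel_imu)  # |AngleVelOdom - AngleVelImu|
    if fabs > d_thre:
        state["count"] = state.get("count", 0) + 1
    else:
        state["count"] = 0                      # streak broken, reset
    return state["count"] > max_frames          # True => robot is slipping


# Example: feed synchronized angular-velocity pairs frame by frame.
state = {}
slipping = update_slip_state(0.50, 0.12, state)
```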
Step S203: if the robot slips, re-determine the current pose of the robot from the point cloud data acquired by the laser radar using a preset algorithm.
The robot slipping means that the robot pose computed from the sensing data output by the odometer may be erroneous, leading to a series of later errors in mapping, localization, and so on, so the pose of the robot needs to be determined again. In the embodiment of the present application, the current pose of the robot is re-determined from point cloud data acquired by the laser radar using a preset algorithm. It should be noted that, in this embodiment, the point cloud data acquired by the laser radar may include both the current point cloud data and earlier historical point cloud data.
In an embodiment of the application, re-determining the current pose of the robot from the point cloud data acquired by the laser radar using the preset algorithm can be implemented through the following steps Sa2031 to Sa2035:
step Sa2031: and acquiring current point cloud data obtained by scanning of the laser radar and first point cloud data when the robot starts to slip.
When the robot is at its current position, the laser radar scans the surrounding environment; the laser beams strike obstacles in the environment and return, and the bearing information of each obstacle is computed from the beam's round trip, forming the point cloud information.
In an alternative embodiment, because the slip must persist for a preset time period or a preset number of frames before it is confirmed, the robot's pose at the moment slip is confirmed is already somewhat inaccurate, whereas its pose at the moment it started to slip is relatively accurate; the current pose therefore needs to be corrected based on the pose at the moment the robot started to slip.
Specifically, current point cloud data obtained by scanning of the laser radar and first point cloud data when the robot starts to slip are obtained.
The current point cloud data is the point cloud whose time node is closest to the moment at which the robot is determined to be slipping, and the first point cloud data is the point cloud whose time node is closest to the moment at which the robot started to slip. Because the inertial measurement unit, the odometer, and the laser radar run at different frequencies, the time nodes at which they sample differ.
Step Sa2032: calculate displacement data of the robot based on the first point cloud data and the current point cloud data.
Displacement data of the robot is then calculated based on the first point cloud data and the current point cloud data; that is, by comparing the first point cloud data with the current point cloud data, the displacement increment of the robot is computed, which includes a position increment and an orientation-angle increment.
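The application does not name the algorithm used to compare the two point clouds; point-to-point ICP (iterative closest point) is one common way to estimate such a displacement increment, sketched below in Python under that assumption:

```python
import numpy as np

def icp_displacement(first_cloud, current_cloud, iters=30):
    """Estimate (dx, dy, dq) between the first point cloud (when slipping
    began) and the current point cloud with point-to-point ICP. Both inputs
    are (N, 2) arrays of 2D points. Brute-force matching for brevity.
    """
    src = first_cloud.astype(float).copy()
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iters):
        # Nearest-neighbour correspondences.
        d2 = ((src[:, None, :] - current_cloud[None, :, :]) ** 2).sum(-1)
        matched = current_cloud[d2.argmin(axis=1)]
        # Optimal rigid transform between matched sets (Kabsch / SVD).
        mu_s, mu_m = src.mean(0), matched.mean(0)
        H = (src - mu_s).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:   # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_s
        src = src @ R.T + t        # apply the increment to the source cloud
        R_total, t_total = R @ R_total, R @ t_total + t
    dq = np.arctan2(R_total[1, 0], R_total[0, 0])
    return t_total[0], t_total[1], dq   # position and orientation increments
```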
Step Sa2033: acquire a predicted particle swarm of the robot using the displacement data and the first particle swarm when the robot starts to slip, the predicted particle swarm comprising a plurality of target sampling particles.
Then, the predicted particle swarm of the robot is acquired using the displacement data and the first particle swarm when the robot starts to slip.
In an alternative embodiment, when the robot is localized, a number of candidate particles are sampled to form a particle swarm; each particle in the swarm corresponds to one hypothesis of the robot's pose. Typically a particle swarm contains a preset number of particles, e.g. 6 to 10, and the count may fluctuate within that range. When the pose is determined, the particle with the highest confidence is selected from the swarm and its pose information is used as the robot's pose; alternatively, the average of the poses of the several highest-confidence particles may be used as the robot's pose.
In an alternative embodiment, the first particle swarm when the robot starts to slip is the particle swarm corresponding to the time node at which the robot starts to slip; it can also be the particle swarm corresponding to the first point cloud data.
Subsequently, the first particle swarm is processed with the displacement data to obtain the predicted particle swarm of the robot, the predicted particle swarm including a plurality of target sampling particles.
Specifically, since the displacement data is the robot's position increment and orientation-angle increment, the first particle swarm is advanced by the displacement data to determine a plurality of predicted particles; that is, for each first particle in the first particle swarm, the displacement data is applied to it to obtain the corresponding predicted particle.
Then, a plurality of target sampling particles are randomly distributed around the predicted particles using a Gaussian model, such that the target sampling particles fall into the non-occupied area of the grid map, i.e. into its blank area, and do not fall into the occupied area of the grid map (an area where an obstacle exists); these target sampling particles form the predicted particle swarm.
Optionally, a plurality of target sampling particles are randomly distributed around each predicted particle, the target sampling particles corresponding to a predicted particle following a Gaussian distribution; optionally, one of the target sampling particles may coincide with the original predicted particle. The target sampling particles constitute the predicted particle swarm.
The sampling particles can be calculated by the following formula:
x_t = x_{t-1} + dx_{t-1} + ε_x
y_t = y_{t-1} + dy_{t-1} + ε_y
q_t = q_{t-1} + dq_{t-1} + ε_q

where x_t and x_{t-1} are the horizontal-axis coordinates of the robot in the world coordinate system at time t and time t-1 respectively, y_t and y_{t-1} are the vertical-axis coordinates at time t and time t-1 respectively, q_t and q_{t-1} are the robot's orientations at time t and time t-1 respectively, dx_{t-1}, dy_{t-1} and dq_{t-1} are the displacement data (i.e. the position increments and the orientation-angle increment), and ε_x, ε_y and ε_q are Gaussian noise.
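A minimal Python sketch of this prediction step under the formula above; the grid_map.is_free interface, the noise standard deviations, and the per-particle sample count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng()

def predict_particles(first_particles, dx, dy, dq, grid_map,
                      n_samples=5, sigma_xy=0.05, sigma_q=0.02):
    """Advance each first particle (x, y, q) by the displacement increment,
    then scatter Gaussian-distributed target sampling particles around the
    predicted particle, keeping only those on free cells of the grid map.
    """
    predicted = []
    for x, y, q in first_particles:
        px, py, pq = x + dx, y + dy, q + dq   # x_t = x_{t-1} + dx_{t-1}, etc.
        for _ in range(n_samples):
            sx = px + rng.normal(0.0, sigma_xy)   # eps_x
            sy = py + rng.normal(0.0, sigma_xy)   # eps_y
            sq = pq + rng.normal(0.0, sigma_q)    # eps_q
            if grid_map.is_free(sx, sy):   # reject occupied (obstacle) cells
                predicted.append((sx, sy, sq))
    return predicted
```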
Step Sa2034: perform confidence evaluation on the plurality of target sampling particles according to the current point cloud data to obtain the confidences of the target sampling particles.
The confidence of a particle is the degree of match between the current point cloud data and the point cloud information obtained from the particle and the grid map. That is, each particle corresponds to a pose, so from that pose and the grid map one can obtain the point cloud information that would be observed at that pose, i.e. the point cloud the robot would see if it were located there.

This point cloud information is then compared with the robot's current point cloud data, i.e. the point cloud actually observed; the higher the matching degree, the closer the particle's pose is to the robot's actual pose, i.e. the higher the particle's confidence.

In this way the matching degree of each particle can be evaluated, yielding its confidence.
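As an illustrative sketch of this confidence evaluation (the exact matching score is not specified in the application), one simple measure counts the fraction of current point cloud points that, transformed by a particle's pose, land on occupied cells; grid_map.is_occupied is an assumed interface:

```python
import numpy as np

def particle_confidence(particle, current_cloud, grid_map):
    """Score one target sampling particle: project the current point cloud
    (robot frame) into the map frame with the particle's pose and measure
    the fraction of points that land on occupied cells of the grid map.
    """
    x, y, q = particle
    c, s = np.cos(q), np.sin(q)
    hits = 0
    for px, py in current_cloud:
        mx = x + c * px - s * py   # rotate then translate into the map frame
        my = y + s * px + c * py
        if grid_map.is_occupied(mx, my):
            hits += 1
    return hits / max(len(current_cloud), 1)   # matching degree in [0, 1]
```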
Step Sa2035: acquire a current particle swarm and output the current target pose according to the current possible poses and the confidences corresponding to the plurality of target sampling particles.
Step Sa2035 may be implemented as follows: from the plurality of target sampling particles, select those whose confidence is greater than a threshold, or sort the target sampling particles by confidence in descending order and select a preset number of them, to form the current particle swarm; the remaining sampling particles are discarded. The specific selection method is not limited here.

Then, the current target pose is determined either as the average of the poses of all target sampling particles in the current particle swarm, or as the pose of the target sampling particle with the highest confidence among all target sampling particles in the current particle swarm.
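A minimal sketch of this selection and output step, assuming the top-k-by-confidence selection and the pose-averaging variant described above; top_k is an illustrative value:

```python
def current_target_pose(particles, confidences, top_k=3):
    """Keep the top_k target sampling particles by confidence as the current
    particle swarm and output the mean of their poses as the current target
    pose (alternatively, the single highest-confidence pose could be used).
    """
    ranked = sorted(zip(confidences, particles),
                    key=lambda pair: pair[0], reverse=True)
    swarm = [p for _, p in ranked[:top_k]]
    n = len(swarm)
    x = sum(p[0] for p in swarm) / n
    y = sum(p[1] for p in swarm) / n
    q = sum(p[2] for p in swarm) / n   # naive mean; fine for small angle spread
    return x, y, q
```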
As can be seen from the robot skid processing method illustrated in fig. 2, on the one hand, the wheel odometer and the inertial measurement unit are inexpensive compared with sensors such as a laser radar, so the present application uses them to acquire the first motion data and the second motion data, keeping the cost of the whole robot relatively low; on the other hand, the volume of the first and second motion data acquired by the wheel odometer and the inertial measurement unit is small compared with the data acquired by sensors such as a laser radar, so judging whether the robot slips from these data takes little time, which provides a real-time guarantee for subsequently re-determining the pose of the slipping robot with the preset algorithm (e.g. a particle filter algorithm).
Referring to fig. 3, the device for processing robot skidding according to the embodiment of the present application may be the central processing unit of a robot or a functional module thereof; the robot is equipped with a wheel odometer and an inertial measurement unit. The device may include a synchronization module 301, a judgment module 302, and a determination module 303, described in detail as follows:
a synchronization module 301, configured to synchronize first motion data acquired by the wheel type odometer with second motion data acquired by the inertial measurement unit, where the first motion data includes a first rotation angular velocity of the robot, and the second motion data includes a second rotation angular velocity of the robot;
a determining module 302, configured to determine whether the robot skids according to a difference between the first rotational angular velocity and the second rotational angular velocity at the same time;
the determining module 303 is configured to, if the robot slips, re-determine the current pose of the robot according to the point cloud data acquired by the laser radar by using a preset algorithm.
For the principle of each module, please refer to the content corresponding to the above embodiment, which is not described herein again.
As can be seen from the device illustrated in fig. 3, on the one hand, the wheel odometer and the inertial measurement unit are inexpensive compared with sensors such as a laser radar, so using them to acquire the first motion data and the second motion data keeps the cost of the whole robot relatively low; on the other hand, the volume of the first and second motion data acquired by the wheel odometer and the inertial measurement unit is small compared with the data acquired by sensors such as a laser radar, so judging whether the robot slips from these data takes little time, providing a real-time guarantee for subsequently re-determining the pose of the slipping robot with the preset algorithm.
Fig. 4 is a schematic structural diagram of an apparatus provided in an embodiment of the present application. As shown in fig. 4, the apparatus 4 of this embodiment may be a robot or a module thereof, and mainly includes: a processor 40, a memory 41, and a computer program 42, such as a program of the robot slip handling method, stored in the memory 41 and executable on the processor 40. When executing the computer program 42, the processor 40 implements the steps of the above robot slip processing method embodiment, such as steps S201 to S203 shown in fig. 2. Alternatively, when executing the computer program 42, the processor 40 implements the functions of the modules/units in the above-described device embodiments, such as the functions of the synchronization module 301, the judgment module 302, and the determination module 303 shown in fig. 3.
Illustratively, the computer program 42 of the robot slip handling method mainly comprises: synchronizing first motion data acquired by the wheel type odometer with second motion data acquired by the inertial measurement unit, wherein the first motion data comprises a first rotation angular velocity of the robot, and the second motion data comprises a second rotation angular velocity of the robot; judging whether the robot slips or not according to the difference value between the first rotation angular velocity and the second rotation angular velocity at the same moment; and if the robot slips, re-determining the current pose of the robot by using a preset algorithm through point cloud data acquired by a laser radar. The computer program 42 may be partitioned into one or more modules/units, which are stored in the memory 41 and executed by the processor 40 to accomplish the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions that describe the execution of the computer program 42 in the device 4. For example, the computer program 42 may be divided into functions of the synchronization module 301, the judgment module 302, and the determination module 303 (modules in the virtual device), and the specific functions of each module are as follows: a synchronization module 301, configured to synchronize first motion data acquired by the wheel type odometer with second motion data acquired by the inertial measurement unit, where the first motion data includes a first rotation angular velocity of the robot, and the second motion data includes a second rotation angular velocity of the robot; a judging module 302, configured to judge whether the robot skids according to a difference between the first rotational angular velocity and the second rotational angular velocity at the same time; and the determining module 303 is configured to re-determine the current pose of the robot by using a preset algorithm through point cloud data acquired by a laser radar if the robot slips.
The device 4 may include, but is not limited to, a processor 40, a memory 41. Those skilled in the art will appreciate that fig. 4 is merely an example of a device 4 and does not constitute a limitation of device 4 and may include more or fewer components than shown, or some components in combination, or different components, e.g., a computing device may also include input-output devices, network access devices, buses, etc.
The Processor 40 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 41 may be an internal storage unit of the device 4, such as a hard disk or a memory of the device 4. The memory 41 may also be an external storage device of the device 4, such as a plug-in hard disk provided on the device 4, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 41 may also include both an internal storage unit of the device 4 and an external storage device. The memory 41 is used for storing computer programs and other programs and data required by the device. The memory 41 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as required to different functional units and modules, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the above-mentioned apparatus may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other ways. For example, the above-described apparatus/device embodiments are merely illustrative; for instance, the division into modules or units is only a division by logical function, and there may be other divisions in actual implementation, for example multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be through some interfaces, indirect coupling or communication connection between devices or units, and may be electrical, mechanical, or in another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a non-transitory computer-readable storage medium. Based on this understanding, the present application may implement all or part of the processes of the above method embodiments by instructing relevant hardware through a computer program. The computer program of the robot slip processing method may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments, namely: synchronizing first motion data acquired by a wheel odometer with second motion data acquired by an inertial measurement unit, the first motion data including a first rotational angular velocity of the robot and the second motion data including a second rotational angular velocity of the robot; judging whether the robot slips according to the difference between the first rotational angular velocity and the second rotational angular velocity at the same moment; and, if the robot slips, re-determining the pose of the robot from point cloud data acquired by a laser radar using a preset algorithm.

The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The non-transitory computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so on. It should be noted that the content of the non-transitory computer-readable medium may be increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, the non-transitory computer-readable medium does not include electrical carrier signals and telecommunications signals.

The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced, and such modifications and substitutions do not depart in substance from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application. The above further details the objects, technical solutions, and advantages of the present application; it should be understood that the above are merely exemplary embodiments of the present application and are not intended to limit its scope, and any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present application shall be included in the scope of the present application.

Claims (10)

1. A robot carrying a wheel odometer and an inertial measurement unit, the robot comprising:
a memory and a processor;
the memory stores executable program code;
the processor, coupled to the memory, invokes executable program code stored in the memory to perform a robot skid handling method comprising:
synchronizing first motion data acquired by the wheel odometer with second motion data acquired by the inertial measurement unit, the first motion data including a first angular velocity of rotation of the robot, the second motion data including a second angular velocity of rotation of the robot;
judging whether the robot slips or not according to the difference value between the first rotation angular velocity and the second rotation angular velocity at the same moment;
and if the robot slips, re-determining the current pose of the robot by using a preset algorithm through point cloud data acquired by a laser radar.
2. The robot of claim 1, wherein the processor invokes the executable program code stored in the memory, and the step of synchronizing the first motion data acquired by the wheeled odometer with the second motion data acquired by the inertial measurement unit comprises:
reading a timestamp imu_time of the current frame of the second motion data, and a timestamp odom_time(0) of the first frame and a timestamp odom_time(n) of the last frame among the n+1 frames of the first motion data, n being a natural number greater than or equal to 1;
if imu_time lies between odom_time(0) and odom_time(n), reading first motion data odom_data(a) whose timestamp is odom_time(a) and first motion data odom_data(b) whose timestamp is odom_time(b), the first motion data odom_data(a) and odom_data(b) being the two adjacent frames of first motion data immediately before and after the current frame of the second motion data;
computing, by an interpolation algorithm according to the timestamp imu_time, the timestamps odom_time(a) and odom_time(b), and the first motion data odom_data(a) and odom_data(b), first motion data odom_data(c) whose timestamp is aligned with imu_time.
3. The robot of claim 1, wherein the processor invokes the executable program code stored in the memory, and the step of judging whether the robot slips according to the difference between the first rotational angular velocity and the second rotational angular velocity at the same moment comprises:
calculating an absolute difference fabs between the first rotational angular velocity and the second rotational angular velocity at the same moment;
determining that the robot slips if the absolute difference fabs remains greater than a preset threshold for longer than a preset time period, or if the number of consecutive frames of data in which the absolute difference fabs is greater than the preset threshold exceeds a preset number of frames.
4. The robot of claim 1, wherein the processor invokes executable program code stored in the memory, the executed step of re-determining the current pose of the robot using a preset algorithm from point cloud data acquired by lidar comprising:
acquiring current point cloud data obtained by scanning of the laser radar and first point cloud data when the robot starts to slip;
calculating displacement data of the robot based on the first point cloud data and the current point cloud data;
acquiring a prediction particle swarm of the robot by using the displacement data and a first particle swarm when the robot starts to slip, wherein the prediction particle swarm comprises a plurality of target sampling particles;
performing confidence evaluation on a plurality of target sampling particles according to the current point cloud data to obtain confidence of the plurality of target sampling particles;
and acquiring a current particle swarm and outputting the current target pose according to the current possible poses and the confidence degrees corresponding to the target sampling particles.
5. The robot of claim 4, wherein the first particle swarm comprises a plurality of first particles, the processor invokes the executable program code stored in the memory, and the step of obtaining a predicted particle swarm of the robot using the displacement data and the first particle swarm when the robot begins to slip comprises:
calculating the first particle swarm based on the displacement data, determining a plurality of predicted particles;
randomly distributing a plurality of target sampling particles around the plurality of prediction particles by using a Gaussian model, and enabling the plurality of target sampling particles to fall into a non-occupied area of a grid map, wherein the plurality of target sampling particles form the prediction particle swarm.
6. The robot of claim 4, wherein the processor invokes the executable program code stored in the memory, and the step of acquiring a current particle swarm and outputting the current target pose according to the current possible poses and the confidences corresponding to the plurality of target sampling particles comprises:
selecting a plurality of target sampling particles with confidence degrees larger than a threshold value from the plurality of target sampling particles or selecting a preset number of target sampling particles from the plurality of target sampling particles which are sorted according to the descending values of the confidence degrees from large to small to form the current particle swarm;
and determining the current target pose based on the average value of the poses corresponding to all target sampling particles in the current particle swarm or determining the pose corresponding to the target sampling particle with the highest confidence level in all target sampling particles in the current particle swarm as the current target pose.
7. A robotic skid management apparatus, the apparatus comprising:
a synchronization module for synchronizing first motion data acquired by a wheel odometer with second motion data acquired by an inertial measurement unit, the first motion data including a first angular velocity of rotation of the robot, the second motion data including a second angular velocity of rotation of the robot;
the judging module is used for judging whether the robot slips or not according to the difference value between the first rotation angular velocity and the second rotation angular velocity at the same moment;
and the determining module is used for re-determining the current pose of the robot by using a preset algorithm through the point cloud data acquired by the laser radar if the robot slips.
8. A method of robot skid handling, the method comprising:
synchronizing first motion data acquired by a wheel odometer with second motion data acquired by an inertial measurement unit, the first motion data comprising a first angular velocity of rotation of the robot, the second motion data comprising a second angular velocity of rotation of the robot;
judging whether the robot slips or not according to the difference value between the first rotation angular velocity and the second rotation angular velocity at the same moment;
and if the robot slips, re-determining the current pose of the robot by using a preset algorithm through point cloud data acquired by a laser radar.
9. The method of claim 8, wherein if the robot slips, the re-determining the current pose of the robot by using a preset algorithm from the point cloud data obtained by the lidar comprises:
acquiring current point cloud data obtained by scanning of the laser radar and first point cloud data when the robot starts to slip;
calculating displacement data of the robot based on the first point cloud data and the current point cloud data;
acquiring a prediction particle swarm of the robot by utilizing the displacement data and a first particle swarm when the robot starts to skid, wherein the prediction particle swarm comprises a plurality of target sampling particles;
performing confidence evaluation on a plurality of target sampling particles according to the current point cloud data to obtain the confidence of the plurality of target sampling particles;
and acquiring a current particle swarm and outputting the current target pose according to the current possible poses and the confidence degrees corresponding to the target sampling particles.
10. A readable storage medium having stored thereon a computer program for, when executed by a processor, implementing a robot slip handling method as implemented by a robot according to any of claims 1-6.
CN202110878199.8A 2021-07-31 2021-07-31 Robot, robot skid processing method, device and readable storage medium Pending CN115683093A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110878199.8A CN115683093A (en) 2021-07-31 2021-07-31 Robot, robot skid processing method, device and readable storage medium


Publications (1)

Publication Number Publication Date
CN115683093A (en) 2023-02-03

Family

ID=85059746

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110878199.8A Pending CN115683093A (en) 2021-07-31 2021-07-31 Robot, robot skid processing method, device and readable storage medium

Country Status (1)

Country Link
CN (1) CN115683093A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination