CN112578369A - Uncertainty estimation method and device, electronic equipment and storage medium - Google Patents


Publication number
CN112578369A
Authority
CN
China
Prior art keywords: variable, distance, detection, determining, information
Legal status: Granted
Application number: CN202011595611.7A
Other languages: Chinese (zh)
Other versions: CN112578369B (en)
Inventor: 赵明 (Zhao Ming)
Current Assignee: Shanghai Sensetime Lingang Intelligent Technology Co Ltd
Original Assignee: Shanghai Sensetime Lingang Intelligent Technology Co Ltd
Application filed by Shanghai Sensetime Lingang Intelligent Technology Co Ltd
Priority to CN202011595611.7A
Publication of CN112578369A
Application granted
Publication of CN112578369B
Legal status: Active

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present disclosure provides an uncertainty estimation method, apparatus, electronic device, and storage medium. The method comprises: determining state variables that represent the pose of a sensor device; for each detection performed by the sensor device, determining the amount of information carried by the corresponding state variable in a preset degree of freedom based on how strongly the detection distance of that detection is perturbed as the corresponding state variable changes within the preset degree of freedom; and determining the uncertainty of the detection result obtained by the sensor device according to the amounts of information from multiple detections. Because the degree of uncertainty of the sensor device's detection result is evaluated from these amounts of information, the evaluation accuracy is high.

Description

Uncertainty estimation method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of positioning technologies, and in particular, to an uncertainty estimation method, apparatus, electronic device, and storage medium.
Background
As technology has advanced, the cost of laser radar (lidar), once used mainly in the military field, has dropped sharply, and lidar is now widely applied in fields such as smart vehicles and mobile robots. Taking mobile robots as an example, a robot can be positioned with lidar-based positioning technology, and reliable positioning is the basis for the robot to complete its subsequent tasks. It is therefore necessary to evaluate the positioning reliability of lidar in various environments.
Currently, the lidar positioning effect is generally evaluated as follows: point cloud data are acquired with the lidar, the point cloud data are associated point by point with a known environment map through nearest-neighbor matching, and the average distance over all associated points is taken as the uncertainty evaluation result of the positioning.
However, the average distance obtained in this way is only a rough measure, and the evaluation accuracy is low.
Disclosure of Invention
The embodiments of the present disclosure provide at least a point cloud data processing scheme in which the degree of uncertainty of the sensor device's detection result is evaluated from the amount of information, so the evaluation accuracy is high.
In a first aspect, an embodiment of the present disclosure provides a method for estimating uncertainty, where the method includes:
determining state variables representing the pose of the sensor equipment;
for each detection of the sensor device, determining the amount of information carried by the corresponding state variable in a preset degree of freedom based on how strongly the detection distance of that detection is perturbed as the corresponding state variable changes within the preset degree of freedom;
and determining the uncertainty of the detection result obtained by the sensor device according to the amounts of information from multiple detections.
With this uncertainty estimation method, the state variables representing the pose of the sensor device are determined first, and the amount of information characterizing the degree of uncertainty of the sensor device's detection result is then determined from how the detection distance of each detection changes with the corresponding state variable within the preset degree of freedom. To a certain extent, this amount of information reflects how much the state variable contributes to evaluating the accuracy of the sensor device's detection result: the larger the amount of information, the more accurate the evaluated detection result.
In one possible implementation, the state variables include: a translational variable and a rotational variable;
the determining the amount of information carried by the corresponding state variable in the preset degree of freedom, based on how strongly the detection distance of the detection is perturbed as the corresponding state variable changes within the preset degree of freedom, comprises:
determining the amount of information carried by the corresponding state variable in the preset degree of freedom based on the partial derivatives of the detection distance with respect to the corresponding translational variable and rotational variable.
Since a partial derivative gives the rate of change of a function value with respect to each independent variable, a larger rate of change means a larger perturbation of the detection distance as the state variable changes, and a smaller rate of change means a smaller perturbation. The magnitude of the perturbation, and hence the amount of information, can therefore be determined from the partial derivatives of the detection distance with respect to the corresponding translational and rotational variables.
In a possible implementation, the determining, based on the partial derivatives of the detection distance with respect to the corresponding translational variable and rotational variable, the amount of information carried by the corresponding state variable in the preset degree of freedom comprises:
constructing an information matrix based on the partial derivatives of the detection distance with respect to the corresponding translational variable and rotational variable;
and determining the amount of information carried by the corresponding state variable in the preset degree of freedom based on the constructed information matrix.
Here, the information matrix is used as a specific way of uncertainty evaluation, where the information matrix may be first constructed based on partial differentiation, and the information amount included in the state variable in each preset degree of freedom may be determined based on the constructed information matrix, which is simple to operate.
In one possible embodiment, the method further comprises:
acquiring pose information of the sensor equipment;
under the condition that the information content of the corresponding state variable in the preset degree of freedom is determined to be not less than the corresponding preset threshold, according to the negative correlation relation between the covariance matrix and the information matrix, determining that the uncertainty of the detection result obtained by the detection of the sensor equipment meets the preset condition;
and determining the pose information of the sensor equipment as a positioning result of the sensor equipment.
Given the negative correlation between the covariance matrix and the information matrix, and the correspondence between the amount of information in each preset degree of freedom and the information matrix, when the amount of information carried by the corresponding state variable in the preset degree of freedom is not less than the corresponding preset threshold (i.e., each corresponding covariance matrix value is below its threshold), the currently determined detection result can be regarded as sufficiently accurate.
In one possible embodiment, the method further comprises:
under the condition that the amount of information carried by the corresponding state variable in the preset degree of freedom is determined to be smaller than the corresponding preset threshold, determining, according to the negative correlation between the covariance matrix and the information matrix, that the uncertainty of the detection result obtained by the sensor device does not meet the preset condition;
and acquiring positioning results of other positioning equipment, fusing the acquired positioning results with the pose information of the sensor equipment, and determining the positioning results of the sensor equipment.
Similarly, given the negative correlation between the covariance matrix and the information matrix and the correspondence between the amount of information in each preset degree of freedom and the information matrix, when the amount of information carried by the corresponding state variable in the preset degree of freedom is smaller than the corresponding preset threshold (i.e., some covariance matrix value is greater than or equal to its threshold), the detection result can be regarded as insufficiently accurate; in this case, fused positioning based on other positioning results further improves the accuracy of the final result.
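As a rough illustration of this decision logic, the sketch below assumes that the per-degree-of-freedom information amounts are read off the diagonal of a 6x6 information matrix and compared against per-degree-of-freedom thresholds; the function names, the threshold handling, and the simple information-weighted blend used for fusion are illustrative assumptions rather than the patent's prescribed computation.

import numpy as np

def choose_localization(info_matrix, lidar_pose, other_pose, thresholds):
    # info_matrix : 6x6 information matrix accumulated over one scan.
    # lidar_pose  : (6,) pose (x, y, z, roll, pitch, yaw) estimated from the lidar.
    # other_pose  : (6,) pose estimated by another positioning device (e.g. GPS/IMU).
    # thresholds  : (6,) preset minimum amount of information per degree of freedom.
    info_per_dof = np.diag(info_matrix)
    if np.all(info_per_dof >= thresholds):
        # Uncertainty meets the preset condition: take the lidar pose as the result.
        return lidar_pose
    # Otherwise fuse: each degree of freedom is blended by its relative
    # information (a naive rule; angles would need wrap-around handling in practice).
    w = info_per_dof / (info_per_dof + thresholds)
    return w * lidar_pose + (1.0 - w) * other_pose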
In one possible embodiment, the sensor device comprises a lidar device; the detection distance comprises an observation distance of the laser beam;
the partial differential of the secondary probe distance pair for the corresponding translational and rotational variables is determined as follows:
determining the distance between the laser radar equipment emitting laser points at the time and a local point cloud plane of a pre-constructed environment point cloud map;
constructing the constraint of the distance based on the translation variable, the rotation variable, the observation distance and the normal vector from the laser point to the local point cloud plane corresponding to the emission;
the partial differential of the detected distance pair for the corresponding translational variable and the rotational variable is determined based on the distance constraint.
Here, for the laser radar device as the sensor device, the constraint of the distance including the translational variable and the rotational variable may be established first, and then the partial differential solution is performed by the constraint of the distance, so that the corresponding partial differential is obtained, the solution process is simple, and the evaluation efficiency is improved while the evaluation accuracy of the detection result is ensured.
In a possible embodiment, the constructing the constraint on the distance based on the translational variable, the rotational variable, the observation distance, and the normal vector of the local point cloud plane corresponding to the emission comprises:
multiplying the rotational variable corresponding to the emission by the observation distance, and then adding the translational variable corresponding to the emission, to obtain the coordinate variable of the laser point corresponding to the emission;
multiplying the coordinate variable of the laser point corresponding to the emission by the normal vector of the local point cloud plane to obtain a distance variable from the emitted laser point to the local point cloud plane of the pre-constructed environment point cloud map;
and obtaining a distance difference variable corresponding to the emitted laser point from the distance variable and the determined distance, and taking the constraint on the distance difference variable as the constraint on the distance.
Here, the distance variable represents the distance from the laser point to the corresponding local point cloud plane. Because the radar coordinate system in which the laser point is collected and the world coordinate system of the point cloud map are not the same, the laser point and the point cloud plane must be placed in the same reference frame before the distance variable is determined: the laser point is converted into the point cloud map to obtain its coordinate variable there, and the distance variable is then determined from the product of the normal vector and that coordinate variable.
In a possible embodiment, the determining, based on the distance constraint, the partial derivatives of the detection distance with respect to the corresponding translational variable and rotational variable comprises:
for the emitted laser point, determining a first partial derivative value of the distance difference variable corresponding to the laser point with respect to the corresponding translational variable, a second partial derivative value of the distance difference variable with respect to the observation distance corresponding to the emission, and a third partial derivative value of the distance difference variable with respect to the corresponding rotational variable;
and determining the partial derivatives of the detection distance with respect to the corresponding translational variable and rotational variable based on a first preset operational relationship between the first and second partial derivative values and a second preset operational relationship between the second and third partial derivative values.
In one possible embodiment, the method further comprises:
setting weight for each detection;
and weighting and summing to obtain the uncertainty of the detection result obtained by the detection of the sensor equipment based on the information quantity of the multiple detections and the weight set by each detection.
In the embodiments of the present disclosure, laser point data acquired from different laser beams influence the accuracy of the detection result to different degrees. For example, laser points at shorter distances reflect richer target features and can be given higher weights, and vice versa. By combining the influence of multiple laser point detections on the evaluation accuracy and determining the uncertainty of the detection result by weighted summation, better accuracy is obtained.
In a possible embodiment, the setting of the weight for each detection includes:
determining the ratio of the distance between the laser point emitted by the laser radar equipment each time and a local point cloud plane of a pre-constructed environment point cloud map to a preset weight coefficient;
and determining the corresponding weight of the laser point in the time of emission based on the ratio.
In one possible embodiment, the local point cloud plane corresponding to a laser point of one emission is determined according to the following steps:
for each laser point, converting first coordinate information of the laser point in a radar coordinate system into second coordinate information in the environment point cloud map based on a conversion relation between the radar coordinate system and a world coordinate system;
searching a map data point with the distance to the second coordinate information smaller than a preset threshold value from the environment point cloud map;
and determining a plane formed by the searched map data points as the local point cloud plane.
In a second aspect, an embodiment of the present disclosure further provides an apparatus for estimating uncertainty, the apparatus including:
the state determination module is used for determining state variables representing the pose of the sensor equipment;
the information amount determination module is used for determining, for each detection of the sensor device, the amount of information carried by the corresponding state variable in the preset degree of freedom based on how strongly the detection distance of that detection is perturbed as the corresponding state variable changes within the preset degree of freedom;
and the uncertainty estimation module is used for determining the uncertainty of the detection result obtained by the detection of the sensor equipment according to the information quantity of the multiple detections.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor being configured to execute the machine-readable instructions stored in the memory, the processor and the memory communicating via the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the method for estimating uncertainty according to the first aspect and any of its various embodiments.
In a fourth aspect, the disclosed embodiments also provide a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by an electronic device, and the electronic device executes the steps of the uncertainty estimation method according to the first aspect and any of the various embodiments of the present invention.
For the description of the effect of the above uncertainty estimation apparatus, electronic device, and computer-readable storage medium, reference is made to the description of the above uncertainty estimation method, which is not repeated here.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for use in the embodiments will be briefly described below, and the drawings herein incorporated in and forming a part of the specification illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It is appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope, for those skilled in the art will be able to derive additional related drawings therefrom without the benefit of the inventive faculty.
FIG. 1 is a flow chart illustrating a method for estimating uncertainty according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram illustrating an uncertainty estimation apparatus provided in the second embodiment of the disclosure;
fig. 3 shows a schematic diagram of an electronic device provided in a third embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of embodiments of the present disclosure, as generally described and illustrated herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
It has been found that positioning uncertainty for radar devices is currently evaluated in roughly the following ways. In the first way, the point cloud data acquired by the radar device are associated point by point with a known environment map through nearest-neighbor matching, and the average distance (for example, the Euclidean or Mahalanobis distance) over all associated points is taken as the uncertainty of the point cloud matching. The drawbacks of this way are obvious: it lacks a complete mathematical justification and is generally used only as a rough engineering description, and in scenes with very poor feature structure, such as tunnels, the computed average registration distance is small while the actual uncertainty is very high; that is, the evaluation result is inaccurate, so its applicability is very limited. In the second way, based on sampling such as a particle filter algorithm, particles are scattered in a Gaussian fashion around the current pose (each particle comprising a pose and a weight), point cloud matching is performed for the pose of each particle, and a registration weight describing how well the point cloud under that pose matches the known point cloud map is computed; the distribution of all particles is then used to compute the current positioning uncertainty. This way is usually time-consuming and is typically used for two-dimensional lidar positioning in two-dimensional space. In the third way, mathematical methods for estimating the uncertainty of variables are derived and applied to lidar positioning, for example using evaluation parameters such as entropy and the Fisher information matrix; this way rests on rigorous mathematical derivation and can describe the uncertainty of point cloud positioning well.
Based on the research, the method and the device at least provide a processing scheme of point cloud data, the uncertainty degree of the detection result of the sensor device is evaluated according to the information quantity, and the evaluation accuracy is high.
The above-mentioned drawbacks are the results of the inventor after practical and careful study, and therefore, the discovery process of the above-mentioned problems and the solutions proposed by the present disclosure to the above-mentioned problems should be the contribution of the inventor in the process of the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To facilitate understanding of the present embodiment, a detailed description is first given of an uncertainty estimation method disclosed in the embodiments of the present disclosure, and an execution subject of the uncertainty estimation method provided in the embodiments of the present disclosure is generally an electronic device with certain computing power, where the electronic device includes: a terminal device, which may be a User Equipment (UE), a mobile device, a User terminal, a cellular phone, a cordless phone, a Personal Digital Assistant (PDA), a handheld device, a computing device, a vehicle mounted device, a wearable device, or a server or other processing device. In some possible implementations, the uncertainty estimation method may be implemented by way of a processor invoking computer readable instructions stored in a memory.
The uncertainty estimation method provided by the embodiments of the present disclosure is described below taking a server as the execution subject as an example.
Example one
Referring to fig. 1, a flowchart of an uncertainty estimation method provided by the embodiment of the present disclosure is shown, and the method includes steps S101 to S103, where:
s101, determining state variables representing the pose of the sensor equipment;
s102, aiming at each detection of the sensor equipment, determining the information quantity contained in the corresponding state variable in the preset degree of freedom based on the size of disturbance on the detection distance of the detection along with the change of the corresponding state variable in the preset degree of freedom;
and S103, determining the uncertainty of the detection result obtained by detecting through the sensor equipment according to the information quantity detected for multiple times.
Here, in order to facilitate understanding of the uncertainty estimation method provided by the embodiments of the present disclosure, an application scenario of the estimation method may be first described in detail. The uncertainty estimation method provided by the embodiment of the disclosure can be applied to any scene in which the reliability of the detection result of the sensor device needs to be evaluated, for example, to the field of mobile robots, where a reliable positioning detection result is a basic task for the robot to complete a subsequent operation task, and for example, to the field of intelligent vehicles, where a reliable positioning detection result is a premise that a vehicle can be safely driven, and can also be applied to other various fields, and no specific limitation is made herein. In view of the wide application of smart vehicle technology, the following description will be given by taking smart vehicle applications as an example. The intelligent vehicle in the embodiment of the present disclosure may include an automatic driving vehicle, and may also include a manual driving vehicle having a partial intelligent function.
The evaluation of the detection result in the uncertainty estimation method provided by the embodiments of the present disclosure is actually a problem of state estimation. Taking the application of the intelligent vehicle as an example, the pose of the sensor device can be used as a state variable, the detection data of the sensor device can be used as a random variable, and the unbiased estimation on the pose of the sensor device can be carried out under the condition of giving the detection data.
The state variables represent the pose of the sensor device. The amount of information carried by the corresponding state variable in a preset degree of freedom can be determined from how strongly the detection distance is perturbed as the pose variable changes within that degree of freedom: to a certain extent, the larger the perturbation, the larger the corresponding amount of information. The larger the amount of information carried by the unknown variable, the smaller, to some extent, the uncertainty of the sensor device's detection result.
The state variables, i.e., (x, y, z, roll, pitch, yaw) in the embodiments of the present disclosure may include not only the translational variables (x, y, z) but also the rotational variables (roll, pitch, yaw). The preset degree of freedom here may include a certain range of variation for each state variable value. The sensor device may be a laser radar device, or may be other devices that require uncertainty evaluation of detection results, such as an image sensor, which is not specifically limited by the embodiments of the present disclosure.
In this way, the partial derivatives of each detection distance with respect to the corresponding translational and rotational variables can be solved, and the corresponding amount of information can then be determined from the solved partial derivatives. This is mainly because a partial derivative characterizes, to a certain extent, the rate of change of a function as a variable changes, which matches the physical meaning above of how strongly the detection distance is perturbed as the corresponding state variable changes within the preset degree of freedom; the embodiments of the present disclosure therefore determine the amount of information from partial derivatives.
In the embodiments of the present disclosure, considering that an information matrix is a common tool for solving uncertainty estimation problems, an information matrix may first be constructed based on the partial derivatives of each detection distance with respect to the corresponding translational and rotational variables, and the amount of information carried by the state variable in each preset degree of freedom may then be determined from the constructed information matrix.
It should be noted that the information matrix and the covariance matrix used for representing the uncertainty degree of the detection result of the sensor device are in negative correlation, that is, the smaller the value of the covariance matrix, the larger the amount of information carried is, and the more accurate the estimated detection result is.
Taking the positioning of the sensor device in three-dimensional space as an example, the determined pose of the sensor device is a 6 × 1 vector, i.e., x, y, z, roll, pitch, yaw, and the covariance matrix may be a 6 × 6 matrix, where the 6 elements on the diagonal describe the variances in the 6 degrees of freedom and the other elements correspond to the covariance of any combination of two degrees of freedom. Each value in the covariance matrix thus describes the quality of the current detection result in the corresponding degrees of freedom, i.e., the degree of uncertainty of the detection result.
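For illustration only, the following small numpy example shows how the diagonal of such a 6x6 covariance matrix would be read as per-degree-of-freedom variances; the numeric values are made up.

import numpy as np

# Made-up 6x6 covariance for the pose (x, y, z, roll, pitch, yaw);
# the diagonal holds the variance of each degree of freedom.
cov = np.diag([0.01, 0.01, 0.04, 1e-4, 1e-4, 4e-4])

for name, var in zip(["x", "y", "z", "roll", "pitch", "yaw"], np.diag(cov)):
    print(f"{name}: variance={var:.1e}, std={var ** 0.5:.3f}")
# Off-diagonal entries cov[i, j] would hold the covariance between degrees of freedom i and j.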
Under the condition that the accuracy of the detection result is relatively good, the positioning result can be determined directly based on the pose information of the sensor equipment, and under the condition that the accuracy of the detection result is relatively poor, the final positioning can be realized by combining other positioning modes. Next, the following two aspects can be explained in detail.
In a first aspect: and under the condition that the information quantity contained in the corresponding state variable in the preset degree of freedom is not less than the corresponding preset threshold value, according to the negative correlation relation between the covariance matrix and the information matrix, determining that the uncertainty of the detection result obtained by detecting the sensor equipment meets the preset condition, and determining the pose information of the sensor equipment as the positioning result of the sensor equipment.
Here, still taking the above-mentioned positioning of the smart vehicle in the three-dimensional space as an example, if the variance in a certain degree of freedom is large, for example, the larger the element in the x direction, i.e. the leftmost upper corner of the covariance matrix is, the larger the uncertainty of the directional positioning is, i.e. the less accurate the detection result is. On the contrary, if the variance in a certain degree of freedom is small, it can be said that the detection result is more accurate.
In the embodiment of the disclosure, under the condition that the variance or covariance value corresponding to each degree of freedom is small (corresponding to a large amount of information contained in each degree of freedom), the pose information of the sensor device can be determined accurately, and can be used as a corresponding positioning result.
In a second aspect: when the amount of information carried by the corresponding state variable in the preset degree of freedom is determined to be smaller than the corresponding preset threshold, it is determined, according to the negative correlation between the covariance matrix and the information matrix, that the uncertainty of the detection result obtained by the sensor device does not meet the preset condition; the positioning results of other positioning devices can then be obtained and fused with the pose information of the sensor device to determine the positioning result of the sensor device.
Here, as an example of positioning the smart vehicle in three-dimensional space, a Global Positioning System (GPS), an Inertial Measurement Unit (IMU), and a lidar device are used together. If, at a certain moment, the covariance of the lidar positioning result is large while the covariances of the other two sensors are small, the final fused positioning output will tend toward the IMU and GPS results, making the positioning result more robust.
It should be noted that the fusion may be performed per degree of freedom; for example, when the lidar positions the translational components more accurately and the IMU positions the rotational components more accurately, the lidar result on the translational components and the IMU result on the rotational components can be fused to obtain the final positioning result.
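The following sketch illustrates one very simple way to realize such a per-degree-of-freedom combination, keeping for each degree of freedom the source with the smaller variance; the patent does not prescribe this particular rule, and a practical system would more likely use a Kalman-style weighted fusion.

import numpy as np

def fuse_by_dof(lidar_pose, lidar_cov, imu_pose, imu_cov):
    # Poses are (x, y, z, roll, pitch, yaw); covariances are 6x6.
    # Per degree of freedom, keep the estimate whose variance is smaller.
    use_lidar = np.diag(lidar_cov) <= np.diag(imu_cov)
    return np.where(use_lidar, lidar_pose, imu_pose)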
In the embodiment of the present disclosure, in a case where the laser radar apparatus is used as a sensor apparatus, the pose information of the laser radar apparatus may be determined based on point cloud data currently acquired by the laser radar apparatus and a point cloud map constructed in advance, where the pose information may be determined by a comparison result between a sparse point cloud feature corresponding to the point cloud map and a point cloud feature of the point cloud data currently acquired. That is, when sparse point cloud data with a high matching degree is found from the point cloud map, the pose information of the lidar device that collects the point cloud data can be determined based on the calibration pose information of the lidar device that is calibrated in advance by the sparse point cloud data.
The embodiment of the disclosure can adopt a rotary scanning radar as a laser radar device to collect the point cloud data, and the rotary scanning radar can acquire the point cloud data of the related target in the surrounding environment of the running device during the horizontal direction rotary scanning. In the process of rotary scanning, the radar can adopt a multi-line scanning mode, namely a plurality of laser tubes are used for emitting in sequence, and the structure is that the plurality of laser tubes are longitudinally arranged, namely, in the process of rotary scanning in the horizontal direction, multi-layer scanning in the vertical direction is carried out. A certain included angle is formed between every two laser tubes, the vertical emission view field can be 30-40 degrees, so that a data packet returned by laser beams emitted by the laser tubes can be obtained when the radar equipment rotates for one scanning angle, and the data packets obtained at all the scanning angles are spliced to obtain point cloud data.
In order to achieve positioning related to the lidar device, the point cloud map in the uncertainty estimation method provided by the embodiment of the disclosure may be constructed in advance. The point cloud map here may be a high-precision map. Compared with the traditional electronic map, the drawing of the high-precision map generally needs to use a professional collection vehicle carrying radar equipment to collect data of a region needing to be mapped. And for the acquired point cloud data, constructing a corresponding three-dimensional object model based on object contour labeling processing, and then arranging the constructed three-dimensional object model at a corresponding map position to construct a point cloud map.
Considering the influence of the partial derivatives on the amount of information, and the key role of the amount of information in the uncertainty evaluation of the detection result, the process of determining the partial derivatives is described below:
step one, determining the distance between the laser point emitted by the lidar device in that emission and a local point cloud plane of the pre-constructed environment point cloud map;
step two, constructing a constraint on the distance based on the translational variable, the rotational variable, the observation distance, and the normal vector of the local point cloud plane corresponding to the emitted laser point;
and step three, determining, based on the distance constraint, the partial derivatives of the detection distance with respect to the corresponding translational variable and rotational variable.
The uncertainty estimation method provided by the embodiments of the present disclosure can determine the positioning result of the sensor device based on the partial derivatives that characterize the degree of uncertainty of the sensor device's detection result; these partial derivatives serve as evaluation parameters and can be obtained by constructing and solving a constraint on the distance from the laser point to the corresponding local point cloud plane.
In the embodiment of the present disclosure, the constraint on the distance may be constructed as follows:
step one, multiplying the rotational variable corresponding to the emission by the observation distance, and then adding the translational variable corresponding to the emission, to obtain the coordinate variable of the laser point corresponding to the emission;
step two, multiplying the coordinate variable of the laser point corresponding to the emission by the normal vector of the local point cloud plane to obtain a distance variable from the emitted laser point to the local point cloud plane of the pre-constructed environment point cloud map;
and step three, obtaining a distance difference variable corresponding to the emitted laser point from the distance variable and the determined distance, and taking the constraint on the distance difference variable as the constraint on the distance.
Here, a distance variable from the laser point to a local point cloud plane of the pre-constructed environment point cloud map is constructed first, and the distance difference variable corresponding to the laser point is then determined from the distance (corresponding to the observation distance) from the laser point to that local point cloud plane. The smaller the value of the distance difference variable, the smaller the difference between the constructed distance and the observed distance, which is the distance constraint to be satisfied in the embodiments of the present disclosure.
The distance difference variable is a correlation function carrying pose information of the sensor equipment, so that the accuracy of a detection result can be evaluated by the solved covariance matrix.
To facilitate understanding of the above process for determining the covariance matrix and solving for the distance difference variables, the following equations may be combined.
cov(x) ≥ [I(x)]^(-1), where
I(x) = Σ_i w_i (∂ρ_i/∂x)^T (∂ρ_i/∂x),
where cov(x) denotes the covariance matrix, I(x) the information matrix, ρ_i the distance variable corresponding to the i-th laser point, x the pose information of the sensor device, and w_i the weight.
It follows that, once the partial derivative corresponding to each detected laser point has been determined, the information matrix can be obtained through the above summation, and the covariance matrix follows from it.
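A minimal numpy sketch of this accumulation is given below. The original expression for I(x) appears only as a formula image in the source, so the weighted quadratic form used here (a weighted sum of outer products of the per-point 1x6 partial-derivative rows) is an assumed reconstruction consistent with the weighted-summation description in the text; all names are illustrative.

import numpy as np

def information_matrix(jacobians, weights):
    # jacobians: (N, 6) array; row i holds the partial derivatives of the i-th
    #            point's distance term with respect to the 6-DOF pose x.
    # weights:   (N,) array of per-point weights w_i.
    info = np.zeros((6, 6))
    for row, w in zip(jacobians, weights):
        row = row.reshape(1, 6)
        info += w * (row.T @ row)
    return info

# Lower bound on the covariance: cov(x) >= inv(I(x)), provided I(x) is invertible.
# cov = np.linalg.inv(information_matrix(jacobians, weights))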
In order to solve the partial differential part in the above formula, the embodiment of the disclosure may construct a distance difference variable corresponding to each laser point, where the distance difference variable is constructed on the premise that a distance variable for representing the laser point to the corresponding local point cloud plane needs to be constructed.
Here, the coordinates in the point cloud map of the laser point originally given in the radar coordinate system may be determined first, and the distance variable may then be obtained from these coordinates and the normal vector perpendicular to the point cloud plane using the point-to-plane distance formula.
The coordinates of the laser point in the point cloud map are obtained by multiplying the rotational variable corresponding to the emission by the observation distance and then adding the translational variable corresponding to the emission.
In a specific application, given the conversion relationship between rotation components and rotation matrices, the rotational variable may first be converted into a rotation matrix variable before the multiplication is performed.
Based on the above, a distance variable can be constructed, and the distance difference variable can then be determined as the difference between the distance variable and the observed distance from the laser point to the corresponding local point cloud plane.
To facilitate understanding of the above-mentioned construction process of the distance difference variable, the following formula can be further described.
[distance difference variable formula image not reproduced]
where d_i denotes the observation distance of the i-th laser point to the corresponding local point cloud plane, n_i the normal vector of the local point cloud plane, X the translational variable, R the rotational variable, r_i the direction vector of the laser beam from which the laser point is taken, and ρ_i the distance variable corresponding to the i-th laser point.
Here, the distance difference variable may be expanded using the conversion relationship between the rotational variable and the rotation matrix variable, as shown in the following equation:
[expanded formula image not reproduced]
where θ denotes the transformation that converts the rotational variable into the rotation matrix variable, and I denotes the identity matrix.
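Since the formula images above are not reproduced, the sketch below implements the construction exactly as the prose describes it (coordinate variable = rotation × observation + translation; distance variable = normal · coordinate; difference = distance variable − determined distance); the parameter names, the use of range × direction for the observation, and the sign convention are assumptions rather than the patent's literal expressions.

import numpy as np

def distance_difference(X, R, r_i, d_i, n_i, rho_i):
    # X:     (3,) translational variable
    # R:     (3, 3) rotational variable expressed as a rotation matrix
    # r_i:   (3,) unit direction vector of the laser beam
    # d_i:   observation distance (range) measured along that beam
    # n_i:   (3,) unit normal vector of the local point cloud plane
    # rho_i: determined distance of the local point cloud plane (from the map)
    coord = R @ (d_i * r_i) + X     # coordinate variable of the laser point in the map frame
    dist_var = n_i @ coord          # distance variable (product with the plane normal)
    return dist_var - rho_i         # distance difference variable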
In the embodiments of the disclosure, to solve for the partial derivative of the detection distance with respect to the pose, the value can be determined through an implicit function constructed according to the implicit function theorem. This can be realized by the following steps:
step one, for the emitted laser point, determining a first partial derivative value of the distance difference variable corresponding to the laser point with respect to the corresponding translational variable, a second partial derivative value of the distance difference variable with respect to the observation distance corresponding to the emission, and a third partial derivative value of the distance difference variable with respect to the corresponding rotational variable;
and step two, determining the partial derivatives of the detection distance with respect to the corresponding translational variable and rotational variable based on a first preset operational relationship between the first and second partial derivative values and a second preset operational relationship between the second and third partial derivative values.
To facilitate understanding of the above partial derivative determination process, the following relations given by the implicit function theorem may be considered:
∂d_i/∂X = −(∂ρ̃_i/∂X) / (∂ρ̃_i/∂d_i),   ∂d_i/∂R = −(∂ρ̃_i/∂R) / (∂ρ̃_i/∂d_i)
where ∂ρ̃_i/∂X denotes the first partial derivative value of the distance difference variable ρ̃_i of the i-th laser point with respect to the corresponding translational variable, ∂ρ̃_i/∂d_i the second partial derivative value of that distance difference variable with respect to the observation distance d_i corresponding to the emission, and ∂ρ̃_i/∂R the third partial derivative value of that distance difference variable with respect to the corresponding rotational variable.
By solving the above equations, the partial derivatives of the detection distance of each detected laser point with respect to the pose information of the sensor device can be obtained:
[formula images not reproduced]
as can be seen from the above equation, the partial differential values of the detection distance pair corresponding to the translational variable and the rotational variable determined by the uncertainty estimation method provided by the embodiment of the present disclosure are only related to the observation direction corresponding to one laser beam and the normal vector corresponding to the point cloud plane.
Once the partial derivative value for each detected laser point has been determined, a weight (i.e., w_i) can be set for each laser point, and an information matrix can then be determined by weighted summation over the laser points' partial derivative values. This information matrix characterizes, to a certain extent, the degree of uncertainty of the sensor device's pose information; it is negatively correlated with the covariance matrix used to represent that uncertainty, i.e., the smaller the covariance matrix values, the larger the amount of information carried and the more accurate the estimated positioning result.
It should be noted that, in the embodiments of the present disclosure, the weight indicates how much the laser beam that collected the laser point influences the evaluation of the uncertainty of the sensor device's pose information. This influence may be determined from the ratio between the distance from the laser point to the corresponding local point cloud plane and a preset weight coefficient, specifically by the following formula:
[weight formula image not reproduced]
where d_i denotes the distance between the i-th laser point and the corresponding local point cloud plane, and ε denotes the preset weight coefficient.
Based on the above formula, the weight w_i set for the i-th detection describes how well the current laser point matches the environment point cloud map: the closer the distance, the better the match and the larger the corresponding weight value. Adapting different weights to different laser points therefore yields a more accurate detection evaluation.
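Since the weight formula itself is only available as an image, the sketch below merely assumes some decreasing function of the ratio d_i/ε so that closer points receive larger weights, as the text describes; the specific 1/(1 + ratio) mapping is an illustrative choice, not the patent's formula.

def point_weight(d_i, epsilon=0.1):
    # d_i: distance from the i-th laser point to its local point cloud plane.
    # epsilon: preset weight coefficient (the value here is arbitrary).
    ratio = d_i / epsilon
    return 1.0 / (1.0 + ratio)  # smaller distance -> better match -> larger weight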
In the embodiment of the present disclosure, the local point cloud plane corresponding to the laser point may be determined based on a point cloud map constructed in advance, and here, after the laser point is mapped to the point cloud map, a plane formed by the map laser point having a sufficiently small distance from the mapped point may be used as the point cloud plane.
In the embodiment of the present disclosure, the local point cloud plane may be determined according to the following steps:
step one, aiming at each laser point, converting first coordinate information of the laser point in a radar coordinate system into second coordinate information in an environment point cloud map based on a conversion relation between the radar coordinate system and a world coordinate system;
searching a map data point with the distance between the map data point and the second coordinate information being smaller than a preset threshold value from the environment point cloud map;
and step three, determining a plane formed by the searched map data points as a local point cloud plane.
Here, the coordinate information may be converted based on a conversion relationship between the radar coordinate system and the world coordinate system, and when the second coordinate information of the laser point in the environment point cloud map is determined, the map data point in the environment point cloud map may be found based on the distance determination result, and the local point cloud plane may be determined.
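The three steps above can be sketched as follows; the KD-tree neighbour search and the SVD plane fit are assumed implementation details (the patent only requires map data points within a preset distance threshold and the plane they form), and all names are illustrative.

import numpy as np
from scipy.spatial import cKDTree

def local_plane(laser_point_radar, T_radar_to_world, map_points, radius=0.5, min_points=5):
    # laser_point_radar: (3,) first coordinate information in the radar frame.
    # T_radar_to_world : (4, 4) homogeneous transform from the radar to the world frame.
    # map_points       : (M, 3) environment point cloud map.
    # Step 1: convert to the second coordinate information in the map.
    p_world = (T_radar_to_world @ np.append(laser_point_radar, 1.0))[:3]
    # Step 2: search map data points closer than the preset threshold (radius).
    tree = cKDTree(map_points)  # in practice the tree would be built once for the whole map
    idx = tree.query_ball_point(p_world, r=radius)
    if len(idx) < min_points:
        return None  # not enough neighbours to form a plane
    # Step 3: take the plane formed by the found points; its normal is the
    # direction of least spread of the centred neighbourhood.
    neighbours = map_points[idx]
    centroid = neighbours.mean(axis=0)
    _, _, vt = np.linalg.svd(neighbours - centroid)
    normal = vt[-1]
    return normal, normal @ centroid  # unit normal and plane offset (n . p for points on the plane)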
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
Based on the same inventive concept, an uncertainty estimation apparatus corresponding to the uncertainty estimation method is also provided in the embodiments of the present disclosure, and since the principle of the apparatus in the embodiments of the present disclosure for solving the problem is similar to the above uncertainty estimation method in the embodiments of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not repeated.
Example two
Referring to fig. 2, a schematic diagram of an uncertainty estimation apparatus provided in an embodiment of the present disclosure is shown, the apparatus including: a state determination module 201, an information amount determination module 202, and an uncertainty estimation module 203; wherein:
a state determination module 201, configured to determine state variables representing the pose of the sensor device;
the information quantity determining module 202 is configured to determine, for each detection of the sensor device, an information quantity included in the corresponding state variable in the preset degree of freedom based on a magnitude of disturbance that a detection distance of the detection changes with the corresponding state variable in the preset degree of freedom;
and the uncertainty estimation module 203 is used for determining the uncertainty of the detection result obtained by the detection of the sensor device according to the information quantity of the multiple detections.
With the above apparatus, the state variables representing the pose of the sensor device can be determined first, and the amount of information characterizing the degree of uncertainty of the sensor device's detection result can then be determined from how the detection distance of each detection changes with the corresponding state variable within the preset degree of freedom. To a certain extent, this amount of information reflects how much the state variable contributes to evaluating the accuracy of the detection result: the larger the amount of information, the more accurate the evaluated detection result.
In one possible implementation, the state variables include: a translational variable and a rotational variable;
an information amount determination module 202, configured to determine, according to the following steps, the amount of information carried by the corresponding state variable in the preset degree of freedom based on how strongly the detection distance is perturbed as the corresponding state variable changes within the preset degree of freedom:
determining the amount of information carried by the corresponding state variable in the preset degree of freedom based on the partial derivatives of the detection distance with respect to the corresponding translational variable and rotational variable.
In a possible implementation, the information amount determination module 202 is configured to determine, according to the following steps, the amount of information carried by the corresponding state variable in the preset degree of freedom based on the partial derivatives of the detection distance with respect to the corresponding translational variable and rotational variable:
constructing an information matrix based on the partial derivatives of the detection distance with respect to the corresponding translational variable and rotational variable;
and determining the amount of information carried by the corresponding state variable in the preset degree of freedom based on the constructed information matrix.
In a possible embodiment, the above apparatus further comprises:
the positioning module 204 is used for acquiring pose information of the sensor device; under the condition that the information content of the corresponding state variable on the preset degree of freedom is determined to be not less than the corresponding preset threshold, according to the negative correlation relation between the covariance matrix and the information matrix, determining that the uncertainty of the detection result obtained by the detection of the sensor equipment meets the preset condition; and determining the pose information of the sensor equipment as a positioning result of the sensor equipment.
In a possible implementation, the positioning module 204 is further configured to:
when it is determined that the amount of information contained in the corresponding state variable over the preset degree of freedom is less than the preset threshold, determine, according to the negative correlation between the covariance matrix and the information matrix, that the uncertainty of the detection result obtained by the sensor device does not meet the preset condition; and acquire the positioning results of other positioning devices, fuse the acquired positioning results with the pose information of the sensor device, and determine the positioning result of the sensor device.
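A rough Python sketch of this decision logic follows. The per-DOF thresholds, the fixed fusion weight, and the direct averaging of pose vectors are simplifying assumptions; a practical system would fuse according to the respective covariances and treat rotation components properly.

```python
import numpy as np

def positioning_result(info_matrix, lidar_pose, info_thresholds, other_pose=None, alpha=0.5):
    """Accept the lidar pose when every degree of freedom carries enough information,
    otherwise fall back to fusing it with another positioning source."""
    info_per_dof = np.diag(info_matrix)
    # Covariance is negatively correlated with information (roughly its inverse),
    # so enough information on every DOF implies acceptably low uncertainty.
    if np.all(info_per_dof >= np.asarray(info_thresholds)) or other_pose is None:
        return np.asarray(lidar_pose, dtype=float)
    # Otherwise fuse with the other device's result (naive weighted average for illustration).
    return alpha * np.asarray(lidar_pose, dtype=float) + (1.0 - alpha) * np.asarray(other_pose, dtype=float)
```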
In one possible embodiment, the sensor device comprises a lidar device; the detection distance comprises an observation distance of the laser beam;
the information quantity determining module 202 is configured to determine the partial derivatives of the detection distance with respect to the corresponding translation variable and rotation variable in the following manner:
determining the distance between the laser point emitted this time by the lidar device and a local point cloud plane of a pre-constructed environment point cloud map;
constructing a distance constraint based on the translation variable, the rotation variable, the observation distance corresponding to this emission, and the normal vector from the laser point to the local point cloud plane;
and determining, based on the distance constraint, the partial derivatives of the current detection distance with respect to the corresponding translation variable and rotation variable.
In a possible implementation, the information quantity determining module 202 is configured to construct the distance constraint based on the translation variable, the rotation variable, the observation distance, and the normal vector from the laser point to the local point cloud plane according to the following steps:
multiplying the rotation variable corresponding to this emission by the observation distance, and then adding the translation variable corresponding to this emission, to obtain the coordinate variable of the laser point for this emission;
multiplying the coordinate variable of the laser point for this emission by the normal vector from the laser point to the local point cloud plane, to obtain the distance variable from the emitted laser point to the local point cloud plane of the pre-constructed environment point cloud map;
and obtaining a distance difference variable for the laser point emitted this time based on that distance variable and the determined distance, and taking the constraint on the distance difference variable as the distance constraint.
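The following Python sketch shows one way this distance-difference variable (a point-to-plane residual) could be formed. Representing the observation as a unit beam direction scaled by the observation distance, and the exact argument names, are assumptions made for illustration.

```python
import numpy as np

def distance_residual(R, t, beam_dir, obs_dist, normal, plane_dist):
    """Distance-difference variable for one emitted laser point.

    R, t       : rotation and translation state variables of the sensor pose
    beam_dir   : assumed unit direction of the laser beam in the sensor frame
    obs_dist   : observation distance of this emission
    normal     : unit normal vector from the laser point to the local point cloud plane
    plane_dist : distance to the local plane determined from the environment map
    """
    point_sensor = obs_dist * np.asarray(beam_dir, dtype=float)  # laser point in the sensor frame
    point_map = R @ point_sensor + t                             # rotate, then add the translation variable
    dist_var = float(np.asarray(normal, dtype=float) @ point_map)  # distance variable along the plane normal
    return dist_var - plane_dist                                 # distance-difference variable to be constrained
```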
In a possible implementation, the information quantity determining module 202 is configured to determine, based on the distance constraint, the partial derivatives of the current detection distance with respect to the corresponding translation variable and rotation variable according to the following steps:
for the laser point emitted this time, determining a first partial derivative of the distance difference variable of the laser point with respect to the translation variable corresponding to this detection, a second partial derivative of the distance difference variable with respect to the observation distance corresponding to this emission, and a third partial derivative of the distance difference variable with respect to the rotation variable corresponding to this detection;
and determining the partial derivatives of the current detection distance with respect to the corresponding translation variable and rotation variable based on a first preset operational relationship between the first partial derivative and the second partial derivative, and a second preset operational relationship between the second partial derivative and the third partial derivative.
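One plausible reading of these operational relationships is implicit differentiation of the distance constraint: with the residual held at zero, the derivative of the observation distance with respect to translation is minus the ratio of the first to the second partial derivative, and with respect to rotation minus the ratio of the third to the second. The Python sketch below follows that reading; the right-perturbation rotation model and all names are assumptions, not the patent's stated formulas.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def detection_distance_jacobian(R, beam_dir, obs_dist, normal):
    """1x6 row of partial derivatives of the observation distance w.r.t. [translation, rotation]."""
    beam_dir = np.asarray(beam_dir, dtype=float)
    normal = np.asarray(normal, dtype=float)
    point_sensor = obs_dist * beam_dir
    d_res_d_t = normal                                  # first partial derivative (translation)
    d_res_d_s = float(normal @ (R @ beam_dir))          # second partial derivative (observation distance)
    d_res_d_rot = -normal @ R @ skew(point_sensor)      # third partial derivative (rotation, right perturbation)
    # Implicit differentiation of residual == 0 gives the two assumed relationships.
    d_s_d_t = -d_res_d_t / d_res_d_s
    d_s_d_rot = -d_res_d_rot / d_res_d_s
    return np.concatenate([d_s_d_t, d_s_d_rot])
```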
In one possible embodiment, the uncertainty estimation module 203 is configured to determine the uncertainty according to the following steps:
setting a weight for each detection;
and performing a weighted summation based on the information quantities of the multiple detections and the weight set for each detection, to obtain the uncertainty of the detection result obtained by the sensor device.
In one possible implementation, the uncertainty estimation module 203 is configured to set a weight for each detection according to the following steps:
determining the ratio of the distance between the laser point emitted each time by the lidar device and a local point cloud plane of the pre-constructed environment point cloud map to a preset weight coefficient;
and determining the weight corresponding to the laser point of that emission based on the ratio.
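A minimal sketch of this weighting and the weighted summation is given below. The text does not fix how the ratio is mapped to a weight, so the decreasing mapping used here (points far from their matched plane get smaller weights), the coefficient value, and the names are illustrative assumptions.

```python
import numpy as np

def beam_weight(point_to_plane_dist, weight_coeff=0.1):
    """Weight for one emitted laser point, derived from the ratio of its
    point-to-plane distance to a preset weight coefficient."""
    ratio = point_to_plane_dist / weight_coeff
    return 1.0 / (1.0 + ratio)          # assumed decreasing mapping of the ratio

def weighted_information(info_amounts, weights):
    """Weighted sum of the per-detection information quantities; the uncertainty
    of the detection result is judged from this aggregate."""
    return float(np.sum(np.asarray(weights, dtype=float) * np.asarray(info_amounts, dtype=float)))
```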
In a possible implementation, the information quantity determining module 202 is configured to determine the local point cloud plane corresponding to an emitted laser point according to the following steps:
for each laser point, converting the first coordinate information of the laser point in the radar coordinate system into second coordinate information in the environment point cloud map based on the conversion relationship between the radar coordinate system and the world coordinate system;
searching the environment point cloud map for map data points whose distance to the second coordinate information is smaller than a preset threshold;
and determining the plane formed by the retrieved map data points as the local point cloud plane.
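As an illustration, the Python sketch below performs these three steps with a KD-tree radius search and a least-squares (SVD) plane fit. The use of SciPy, the SVD-based fit, and the minimum-neighbour check are assumptions; the disclosure only requires that the plane be formed by the retrieved map data points.

```python
import numpy as np
from scipy.spatial import cKDTree   # assumed helper for the radius search

def local_point_cloud_plane(point_radar, R_wr, t_wr, map_points, radius=0.5):
    """Match one laser point to a local plane of the environment point cloud map.

    point_radar : first coordinate information (radar coordinate system)
    R_wr, t_wr  : conversion from the radar coordinate system to the world/map frame
    map_points  : Nx3 array of environment point cloud map data points
    radius      : preset distance threshold for the neighbourhood search
    """
    point_map = R_wr @ np.asarray(point_radar, dtype=float) + t_wr   # second coordinate information
    idx = cKDTree(map_points).query_ball_point(point_map, r=radius)
    if len(idx) < 3:
        return None                                                  # not enough points to form a plane
    neighbours = np.asarray(map_points, dtype=float)[idx]
    centroid = neighbours.mean(axis=0)
    # Plane normal = right singular vector with the smallest singular value.
    _, _, vT = np.linalg.svd(neighbours - centroid)
    normal = vT[-1]
    distance = float(normal @ (point_map - centroid))                # signed point-to-plane distance
    return normal, distance
```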
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
EXAMPLE III
An embodiment of the present disclosure further provides an electronic device. As shown in fig. 3, which is a schematic structural diagram of the electronic device provided in the embodiment of the present disclosure, the electronic device includes a processor 301, a memory 302, and a bus 303. The memory 302 stores machine-readable instructions executable by the processor 301 (for example, execution instructions corresponding to the state determination module 201, the information quantity determining module 202, and the uncertainty estimation module 203 in the apparatus of fig. 2). When the electronic device runs, the processor 301 communicates with the memory 302 through the bus 303, and the machine-readable instructions, when executed by the processor 301, perform the following processing:
determining state variables representing the pose of the sensor device;
for each detection of the sensor device, determining the amount of information contained in the corresponding state variable over the preset degree of freedom based on the magnitude of the disturbance of the detection distance of that detection as the corresponding state variable changes within the preset degree of freedom;
and determining, according to the information quantities of the multiple detections, the uncertainty of the detection result obtained by the sensor device.
For the specific execution process of the instruction, reference may be made to the steps of the uncertainty estimation method in the embodiments of the present disclosure, which are not described herein again.
The embodiments of the present disclosure also provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, performs the steps of the uncertainty estimation method described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the uncertainty estimation method provided in the embodiments of the present disclosure includes a computer-readable storage medium storing program code, and the instructions included in the program code may be used to execute the steps of the uncertainty estimation method described in the above method embodiments; for details, reference may be made to the above method embodiments, which are not repeated here.
The embodiments of the present disclosure also provide a computer program which, when executed by a processor, implements any one of the methods of the foregoing embodiments. The corresponding computer program product may be embodied in hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium; in another alternative embodiment, it is embodied as a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.

In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division of the units is only one logical division, and there may be other divisions in actual implementation; a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above-mentioned embodiments are merely specific embodiments of the present disclosure, which are used for illustrating the technical solutions of the present disclosure and not for limiting the same, and the scope of the present disclosure is not limited thereto, and although the present disclosure is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive of the technical solutions described in the foregoing embodiments or equivalent technical features thereof within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure, and should be construed as being included therein. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (14)

1. A method of estimating uncertainty, the method comprising:
determining state variables representing the pose of the sensor device;
for each detection of the sensor device, determining the amount of information contained in the corresponding state variable over the preset degree of freedom based on the magnitude of the disturbance of the detection distance of that detection as the corresponding state variable changes within the preset degree of freedom;
and determining, according to the information quantities of the multiple detections, the uncertainty of the detection result obtained by the sensor device.
2. The estimation method according to claim 1, characterized in that the state variables comprise: a translational variable and a rotational variable;
the determining the amount of information contained in the corresponding state variable over the preset degree of freedom based on the magnitude of the disturbance of the detection distance of the detection as the corresponding state variable changes within the preset degree of freedom comprises:
determining the amount of information contained in the corresponding state variable over the preset degree of freedom based on the partial derivatives of the detection distance with respect to the corresponding translation variable and rotation variable.
3. The estimation method according to claim 1 or 2, wherein the determining the amount of information contained in the corresponding state variable over the preset degree of freedom based on the partial derivatives of the current detection distance with respect to the corresponding translation variable and rotation variable comprises:
constructing an information matrix based on the partial derivatives of the detection distance with respect to the corresponding translation variable and rotation variable;
and determining the amount of information contained in the corresponding state variable over the preset degree of freedom based on the constructed information matrix.
4. The estimation method according to claim 3, characterized in that the method further comprises:
acquiring pose information of the sensor device;
when it is determined that the amount of information of the corresponding state variable over the preset degree of freedom is not less than the corresponding preset threshold, determining, according to the negative correlation between the covariance matrix and the information matrix, that the uncertainty of the detection result obtained by the sensor device meets the preset condition;
and determining the pose information of the sensor device as the positioning result of the sensor device.
5. The estimation method according to claim 4, characterized in that the method further comprises:
when it is determined that the amount of information of the corresponding state variable over the preset degree of freedom is less than the preset threshold, determining, according to the negative correlation between the covariance matrix and the information matrix, that the uncertainty of the detection result obtained by the sensor device does not meet the preset condition;
and acquiring the positioning results of other positioning devices, fusing the acquired positioning results with the pose information of the sensor device, and determining the positioning result of the sensor device.
6. The estimation method according to claim 2, characterized in that the sensor device comprises a lidar device; the detection distance comprises an observation distance of the laser beam;
and the partial derivatives of the current detection distance with respect to the corresponding translation variable and rotation variable are determined as follows:
determining the distance between the laser point emitted this time by the lidar device and a local point cloud plane of a pre-constructed environment point cloud map;
constructing a distance constraint based on the translation variable, the rotation variable, the observation distance corresponding to this emission, and the normal vector from the laser point to the local point cloud plane;
and determining, based on the distance constraint, the partial derivatives of the detection distance with respect to the corresponding translation variable and rotation variable.
7. The estimation method according to claim 6, wherein the constructing the distance constraint based on the translation variable, the rotation variable, the observation distance corresponding to this emission, and the normal vector from the laser point to the local point cloud plane comprises:
multiplying the rotation variable corresponding to this emission by the observation distance, and then adding the translation variable corresponding to this emission, to obtain the coordinate variable of the laser point for this emission;
multiplying the coordinate variable of the laser point for this emission by the normal vector from the laser point to the local point cloud plane, to obtain the distance variable from the emitted laser point to the local point cloud plane of the pre-constructed environment point cloud map;
and obtaining a distance difference variable for the laser point emitted this time based on that distance variable and the determined distance, and taking the constraint on the distance difference variable as the distance constraint.
8. The estimation method according to claim 7, wherein the determining, based on the distance constraint, the partial derivatives of the detection distance with respect to the corresponding translation variable and rotation variable comprises:
for the laser point emitted this time, determining a first partial derivative of the distance difference variable of the laser point with respect to the translation variable corresponding to this detection, a second partial derivative of the distance difference variable with respect to the observation distance corresponding to this emission, and a third partial derivative of the distance difference variable with respect to the rotation variable corresponding to this detection;
and determining the partial derivatives of the current detection distance with respect to the corresponding translation variable and rotation variable based on a first preset operational relationship between the first partial derivative and the second partial derivative, and a second preset operational relationship between the second partial derivative and the third partial derivative.
9. The estimation method according to claim 3, further comprising:
setting a weight for each detection;
and performing a weighted summation based on the information quantities of the multiple detections and the weight set for each detection, to obtain the uncertainty of the detection result obtained by the sensor device.
10. The estimation method according to claim 9, wherein the setting of the weight for each detection comprises:
determining the ratio of the distance between the laser point emitted each time by the lidar device and a local point cloud plane of the pre-constructed environment point cloud map to a preset weight coefficient;
and determining the weight corresponding to the laser point of that emission based on the ratio.
11. The estimation method according to any one of claims 6 to 8 and 10, wherein the local point cloud plane corresponding to an emitted laser point is determined according to the following steps:
for each laser point, converting the first coordinate information of the laser point in the radar coordinate system into second coordinate information in the environment point cloud map based on the conversion relationship between the radar coordinate system and the world coordinate system;
searching the environment point cloud map for map data points whose distance to the second coordinate information is smaller than a preset threshold;
and determining the plane formed by the retrieved map data points as the local point cloud plane.
12. An apparatus for estimating uncertainty, the apparatus comprising:
a state determination module, configured to determine state variables representing the pose of the sensor device;
an information quantity determining module, configured to determine, for each detection of the sensor device, the amount of information contained in the corresponding state variable over the preset degree of freedom based on the magnitude of the disturbance of the detection distance of that detection as the corresponding state variable changes within the preset degree of freedom;
and an uncertainty estimation module, configured to determine, according to the information quantities of the multiple detections, the uncertainty of the detection result obtained by the sensor device.
13. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor being configured to execute the machine-readable instructions stored in the memory, the processor and the memory communicating via the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the uncertainty estimation method according to any one of claims 1 to 11.
14. A computer-readable storage medium, characterized in that it has stored thereon a computer program which, when being executed by an electronic device, carries out the steps of the uncertainty estimation method according to any one of claims 1 to 11.
CN202011595611.7A 2020-12-29 2020-12-29 Uncertainty estimation method and device, electronic equipment and storage medium Active CN112578369B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011595611.7A CN112578369B (en) 2020-12-29 2020-12-29 Uncertainty estimation method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011595611.7A CN112578369B (en) 2020-12-29 2020-12-29 Uncertainty estimation method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112578369A true CN112578369A (en) 2021-03-30
CN112578369B CN112578369B (en) 2024-04-12

Family

ID=75144395

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011595611.7A Active CN112578369B (en) 2020-12-29 2020-12-29 Uncertainty estimation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112578369B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9715287D0 (en) * 1997-07-22 1997-09-24 Baroid Technology Inc Improvements in or relating to aided inertial navigation systems
CN102982248A (en) * 2012-12-12 2013-03-20 北京理工大学 Sequential land form overlapped view field estimation method based on linear matrix inequality
CN105758408A (en) * 2016-01-05 2016-07-13 福州华鹰重工机械有限公司 Method and device for building local maps

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116740197A (en) * 2023-08-11 2023-09-12 之江实验室 External parameter calibration method and device, storage medium and electronic equipment
CN116740197B (en) * 2023-08-11 2023-11-21 之江实验室 External parameter calibration method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN112578369B (en) 2024-04-12

Similar Documents

Publication Publication Date Title
Segal et al. Generalized-icp.
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
CN110470333B (en) Calibration method and device of sensor parameters, storage medium and electronic device
CN113592989A (en) Three-dimensional scene reconstruction system, method, equipment and storage medium
CN111709988B (en) Method and device for determining characteristic information of object, electronic equipment and storage medium
JP2014523572A (en) Generating map data
CN112465877B (en) Kalman filtering visual tracking stabilization method based on motion state estimation
Taylor et al. Automatic calibration of multi-modal sensor systems using a gradient orientation measure
CN113970922A (en) Point cloud data processing method and intelligent driving control method and device
CN112946591A (en) External parameter calibration method and device, electronic equipment and storage medium
CN112284376A (en) Mobile robot indoor positioning mapping method based on multi-sensor fusion
CN113111513B (en) Sensor configuration scheme determining method and device, computer equipment and storage medium
CN115451948A (en) Agricultural unmanned vehicle positioning odometer method and system based on multi-sensor fusion
CN112907746A (en) Method and device for generating electronic map, electronic equipment and storage medium
CN113822944B (en) External parameter calibration method and device, electronic equipment and storage medium
CN112578369A (en) Uncertainty estimation method and device, electronic equipment and storage medium
CN113222042A (en) Evaluation method, evaluation device, electronic equipment and storage medium
CN117274255A (en) Data detection method, device, electronic equipment and storage medium
CN113495281B (en) Real-time positioning method and device for movable platform
CN115060268A (en) Fusion positioning method, system, equipment and storage medium for machine room
Liu et al. Collaborative radio SLAM for multiple robots based on WiFi fingerprint similarity
Li et al. A single-shot pose estimation approach for a 2D laser rangefinder
CN111708046A (en) Method and device for processing plane data of obstacle, electronic equipment and storage medium
CN113029166B (en) Positioning method, positioning device, electronic equipment and storage medium
CN115077467B (en) Cleaning robot posture estimation method and device and cleaning robot

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant