CN116337072A - Construction method, construction equipment and readable storage medium for engineering machinery - Google Patents
Construction method, construction equipment and readable storage medium for engineering machinery
- Publication number
- CN116337072A (application number CN202310296460.2A)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- current frame
- point
- laser
- imu
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1652—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The invention provides a mapping method, mapping equipment and a readable storage medium for engineering machinery. According to the technical scheme, IMU data and an IMU odometer are used to correct the motion distortion of the current frame laser point cloud; point-line and point-plane features are extracted according to the curvature of each point of the current frame laser point cloud and matched against a local map; and factor graph optimization is performed on key frames. The zero offset of the IMU is corrected according to the output laser odometer, the IMU pre-integrator is reset, and the IMU odometer is output. When the current frame laser point cloud meets the loop detection condition, loop detection is performed on it by using a normal distribution transformation algorithm and the global pose is updated, thereby solving the mapping problem of engineering machinery in degraded scenes and realizing fast mapping and positioning in large scenes.
Description
Technical Field
The present invention relates to the field of mobile mapping, and in particular to a mapping method, apparatus, and readable storage medium for engineering machinery.
Background
Engineering machinery is widely used in the engineering construction field, and with the continuous development of science and technology worldwide, the intellectualization of engineering machinery has become a future trend. Intelligent engineering machinery covers intelligent products and construction, intelligent manufacturing, intelligent service and intelligent management. Unmanned operation of engineering machinery belongs to the product-and-construction category and means that the machine can autonomously complete its travelling and operation tasks without an operator.
Unmanned engineering machinery is generally considered to comprise five modules: environment sensing, whole-vehicle positioning, decision making and planning, motion control and automatic operation. Whole-vehicle positioning is the basis for decision planning, motion control and automatic operation. Unlike an ordinary passenger car, engineering machinery works under complex conditions and in harsh environments, and a single sensor cannot meet its requirement for high-precision, high-robustness whole-vehicle positioning.
At present, unmanned whole-vehicle technology is mainly applied in the automotive field, and related research in the engineering machinery field is relatively scarce. Existing whole-vehicle positioning technology for unmanned engineering machinery mainly relies on a single sensor and suffers from slow mapping, easy mapping degradation and low positioning robustness.
In view of this, the present application is presented.
Disclosure of Invention
The invention discloses a mapping method, a mapping device and a readable storage medium for engineering machinery, aiming to solve the problems of slow mapping, easy mapping degradation and low positioning robustness of engineering machinery.
The first embodiment of the invention provides a construction method of engineering machinery, which comprises the following steps:
acquiring environment information acquired by a vehicle-mounted multi-sensor fusion platform, and correcting motion distortion of a current frame laser point cloud of the environment information by using IMU data and an IMU odometer;
extracting point-line characteristics and point-plane characteristics according to the curvature of each point of the corrected laser point cloud of the current frame, and matching the point-line characteristics and the point-plane characteristics with a local map;
judging whether the laser point cloud of the current frame is a key frame or not;
if not, correcting the zero offset of the IMU, and resetting the IMU pre-integrator;
if so, performing factor graph optimization on the laser point cloud of the current frame, registering a local map formed by the laser point cloud of the current frame and a historical frame by using a normal distribution transformation algorithm, and optimizing and correcting the global pose.
Preferably, the acquiring the environmental information acquired by the vehicle-mounted multi-sensor fusion platform, and correcting the motion distortion of the current frame laser point cloud of the environmental information by using the IMU data and the IMU odometer, specifically comprises:
Pre-integrating the IMU data to obtain a coordinate transformation matrix of the laser radar from the end scanning moment to the start scanning moment of the current frame of laser point cloud;
performing motion distortion correction on the laser point cloud of the current frame by using the coordinate transformation matrix; wherein the coordinate transformation matrix is obtained by linear interpolation over the scan period:

T_i^k = ((t_i − t_k) / (t_{k+1} − t_k)) · T_{k+1}^k,  P_i^k = T_i^k · P_i

wherein T_i^k is the coordinate transformation matrix of each point of the current frame laser point cloud relative to the start scanning moment, T_{k+1}^k is the coordinate transformation matrix of the laser radar from the end scanning moment to the start scanning moment, t_i is the measurement moment of each point of the current frame laser point cloud within the scan period, t_k is the start scanning moment, t_{k+1} is the end scanning moment, P_i^k is the coordinate of each point of the current frame laser point cloud in the laser radar coordinate system at the start scanning moment, and P_i is the coordinate of each point of the current frame laser point cloud in the laser radar coordinate system at the current moment.
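The interpolation-based distortion correction above can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name `deskew_points` and the use of SciPy spherical linear interpolation for the fractional rotation are assumptions made for the example.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def deskew_points(points, times, T_end_to_start, t_start, t_end):
    """De-skew one LiDAR sweep: map every point into the scan-start frame.

    points:  (N, 3) coordinates in the sensor frame at each point's capture time
    times:   (N,) per-point timestamps inside [t_start, t_end]
    T_end_to_start: 4x4 homogeneous transform of the sensor from scan end
                    back to scan start (here supplied by IMU pre-integration)
    """
    # Interpolate between the identity (scan start) and the full transform.
    key_rots = Rotation.from_matrix(np.stack([np.eye(3),
                                              T_end_to_start[:3, :3]]))
    slerp = Slerp([0.0, 1.0], key_rots)
    t_full = T_end_to_start[:3, 3]
    s = (np.asarray(times, dtype=float) - t_start) / (t_end - t_start)
    out = np.empty_like(np.asarray(points, dtype=float))
    for i, (p, si) in enumerate(zip(points, s)):
        R_i = slerp(si)                        # fractional rotation at ratio s
        out[i] = R_i.apply(p) + si * t_full    # fractional translation
    return out
```

A point captured exactly at the scan start (s = 0) is left untouched, while a point captured at the scan end (s = 1) receives the full end-to-start transform.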
Preferably, the extracting of the point line feature and the point-plane feature according to the curvature of each point of the corrected laser point cloud of the current frame, and matching the point line feature and the point-plane feature with a local map specifically includes:
and calling a curvature calculation model to generate the curvature of each point of the current frame laser point cloud, wherein the curvature calculation model is specifically:

K = ( Σ_{j=−5,…,5, j≠0} (d_{i+j} − d_i) )²

wherein K is the curvature of the i-th point of the laser point cloud, d_i is the distance from the i-th point to the origin of the laser radar coordinate system, and d_{i+j} (j = −5, −4, …, 5, j ≠ 0) are the distances from the five laser points on each side of the i-th point to the origin of the laser radar coordinate system;
feature extraction and division are carried out on the laser point cloud of the current frame according to a preset curvature threshold value, points with curvature of each point of the laser point cloud of the current frame being larger than the curvature threshold value are judged to be corner points, and otherwise, the points are judged to be face points;
voxel filtering is carried out on the corner points and the face points, and a characteristic point cloud of the current frame is generated;
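The curvature-based split into corner points and plane (face) points might be sketched as below; a rough illustration assuming the common LOAM-style squared-sum-of-range-differences curvature over five neighbours on each side. The threshold value is hypothetical, and voxel filtering is omitted.

```python
import numpy as np

def extract_features(ranges, curvature_threshold=0.1):
    """Split one scan line into corner (edge) and surface point indices.

    ranges: (N,) distance of each point to the LiDAR origin.
    A point is a corner when its curvature exceeds the threshold,
    otherwise it is a surface (face) point.
    """
    ranges = np.asarray(ranges, dtype=float)
    n = len(ranges)
    corners, surfaces = [], []
    for i in range(5, n - 5):
        # sum of range differences over the 5 neighbours on each side
        diff = sum(ranges[i + j] - ranges[i]
                   for j in range(-5, 6) if j != 0)
        k = diff ** 2
        (corners if k > curvature_threshold else surfaces).append(i)
    return corners, surfaces
```

On a perfectly flat wall all points come out as surface points; a range discontinuity produces a large curvature and is classified as a corner.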
searching the local map, by using a kd-tree, for the points closest to each corner feature of the current frame feature point cloud, connecting those points into a line, and generating the distance from the corner point of the current frame feature point cloud to the line, wherein the calculation model is specifically:

d_ε = |(p̃_{k+1,i}^w − p_{k,a}^w) × (p̃_{k+1,i}^w − p_{k,b}^w)| / |p_{k,a}^w − p_{k,b}^w|

wherein p̃_{k+1,i}^w is the coordinate, in the world coordinate system, of the i-th corner point of the current frame feature point cloud at moment k+1, and p_{k,a}^w and p_{k,b}^w are the coordinates, in the world coordinate system, of the two points on the corresponding edge-line feature in the local map;
for the plane-point features in the feature point cloud of the current frame, searching the local map, by using a kd-tree, for the five points closest to the plane-point feature, and constructing the following plane equation:
Ax+By+Cz+D=0;
wherein A, B, C and D are the plane equation coefficients, and (x, y, z) are the coordinates of the five points in the local map closest to the plane-point feature of the current frame feature point cloud;
substituting the coordinates of the five points, corresponding in the local map to the plane-point feature of the current frame feature point cloud, into the plane equation to obtain the plane equation coefficients, and solving, from those coefficients, the distance from the plane point of the current frame feature point cloud to the feature plane of the local map; the calculation model is specifically:

d_H = |A·x_p + B·y_p + C·z_p + D| / √(A² + B² + C²)

wherein X = [x_p, y_p, z_p]^T is the coordinate, in the world coordinate system, of the plane point in the current frame feature point cloud;
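The two point-to-feature distances (corner-to-line and plane-point-to-plane) can be sketched in a few lines. This is a minimal illustration; the plane is fitted to the neighbours by SVD rather than by solving for A, B, C, D explicitly, which is an equivalent but not identical formulation, and the function names are assumptions.

```python
import numpy as np

def point_to_line_distance(p, a, b):
    """Distance from corner point p to the line through map points a and b."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    return np.linalg.norm(np.cross(p - a, p - b)) / np.linalg.norm(a - b)

def point_to_plane_distance(p, neighbors):
    """Distance from surface point p to the plane fitted to >= 3 map points."""
    pts = np.asarray(neighbors, dtype=float)
    centroid = pts.mean(axis=0)
    # The plane normal is the smallest singular vector of the centred points.
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    d_coeff = -n @ centroid           # plays the role of D in Ax+By+Cz+D=0
    return abs(n @ np.asarray(p, dtype=float) + d_coeff) / np.linalg.norm(n)
```

Both distances are what the subsequent residual equation aggregates over all corner and plane points.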
according to the distance from each point of the current frame laser point cloud to its corresponding edge-line feature of the local map and the distance to its corresponding plane feature of the local map, constructing the following residual equation:

f(P_i^w, m) = f(T_i^w · P_i^L, m) = f(R · P_i^L + t, m)

wherein P_i^w is the coordinate of the i-th point of the current frame feature point cloud in the world coordinate system, P_i^L is the coordinate of the i-th point of the current frame feature point cloud in the laser radar coordinate system, T_i^w is the coordinate transformation matrix transforming the i-th point of the current frame feature point cloud from the laser radar coordinate system to the world coordinate system, m is the local map in the world coordinate system, and R and t are respectively the rotation matrix and the translation matrix of the i-th point of the current frame feature point cloud from the laser radar coordinate system to the world coordinate system;
and constructing the following iterative equation by using the residual equation in combination with the Levenberg algorithm:

(JᵀJ + μI) · ΔT = −Jᵀf

wherein J is the Jacobian matrix of the residual equation with respect to the iteration variable, μ is the diagonal update operator, I is the identity matrix, ΔT is the variable increment of a single iteration, and f is the residual equation, with P_i^L the coordinate of the i-th point of the current frame feature point cloud in the laser radar coordinate system, T_i^w the coordinate transformation matrix transforming the i-th point of the current frame feature point cloud from the laser radar coordinate system to the world coordinate system, and m the local map in the world coordinate system;
calculating the residual sum of the feature matching, wherein the calculation model is:

d = Σ_{i=1}^{N_ε} d_{εi} + Σ_{j=1}^{N_H} d_{Hj}

wherein d_{εi} is the residual of the i-th corner point of the current frame feature point cloud with respect to its corresponding edge-line feature of the local map, d_{Hj} is the residual of the j-th plane point of the current frame feature point cloud with respect to its corresponding plane feature of the local map, and N_ε and N_H are respectively the numbers of corner points and plane points in the current frame feature point cloud;
and updating the diagonal update operator according to the residual sum of the feature matching: when the residual sum is too large, the diagonal operator is increased; otherwise, it is decreased.
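A Levenberg iteration with an adaptively updated diagonal operator can be illustrated roughly as follows; `levenberg_step` is a generic single-step sketch, not the patent's solver, and the residual and Jacobian callbacks stand in for the corner and plane residuals above.

```python
import numpy as np

def levenberg_step(residual_fn, jacobian_fn, x, mu, nu=2.0):
    """One Levenberg-Marquardt update with adaptive damping.

    Solves (J^T J + mu I) dx = -J^T f; the step is accepted and mu shrunk
    when the residual sum decreases, otherwise rejected and mu grown."""
    f = residual_fn(x)
    J = jacobian_fn(x)
    A = J.T @ J + mu * np.eye(len(x))
    dx = np.linalg.solve(A, -J.T @ f)
    if np.sum(residual_fn(x + dx) ** 2) < np.sum(f ** 2):
        return x + dx, mu / nu   # good step: lean toward Gauss-Newton
    return x, mu * nu            # bad step: lean toward gradient descent
```

Repeating the step drives a simple residual to zero; with a trivial residual x − 3 the iterate converges to 3.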
Preferably, the correction of the zero offset of the IMU is performed, and the IMU pre-integrator is reset, specifically:
defining a pre-integrator for optimizing zero offset of the IMU and a pre-integrator for calculating the IMU odometer respectively;
Acquiring a laser odometer, updating a factor graph, judging whether a factor tree in the updated factor graph exceeds a preset value, and reconstructing the factor graph if the factor tree exceeds the preset value;
adding an IMU original data pre-integral factor, a zero offset factor and a current frame laser odometer factor between the current frame laser point cloud and the previous frame laser point cloud into the updated factor graph;
taking the pose and the speed estimated by the pre-integrator for optimizing the IMU zero offset as the estimated values of the current frame pose factor and the current frame rate factor, taking the zero offset optimized by the previous frame as the estimated value of the current frame zero offset factor, and performing factor graph optimization to generate the speed and the zero offset after optimization and updating;
resetting a pre-integrator for optimizing zero offset of the IMU by using the zero offset after optimization and updating;
checking the speed and zero offset after optimization updating, and reconstructing a factor graph to optimize if the speed and zero offset do not meet the checking requirement;
resetting the IMU odometer pre-integrator by using the optimized and updated zero offset, and pre-integrating the IMU raw data after the current frame laser point cloud moment to obtain a pre-integration result;
and estimating the state of the current frame laser point cloud moment by utilizing the state of the current frame laser point cloud moment and IMU zero offset obtained through factor graph optimization and combining the pre-integration result, and outputting the state as a high-frequency odometer.
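The IMU pre-integration underlying both pre-integrators can be sketched as below. This is a deliberately simplified first-order illustration with assumed function and parameter names; a production pre-integrator would additionally propagate covariance and the Jacobians with respect to the biases so the zero offset can be corrected without re-integration.

```python
import numpy as np

def preintegrate_imu(samples, dt, bias_acc, bias_gyro,
                     g=np.array([0.0, 0.0, -9.81])):
    """Pre-integrate raw IMU samples between two laser point cloud moments.

    samples: list of (accelerometer, gyroscope) measurements in the body frame.
    Returns the relative rotation (matrix), velocity change and position
    change after removing the current zero-offset estimates."""
    R = np.eye(3)
    v = np.zeros(3)
    p = np.zeros(3)
    for acc, gyro in samples:
        w = np.asarray(gyro, dtype=float) - bias_gyro
        a = R @ (np.asarray(acc, dtype=float) - bias_acc) + g
        p = p + v * dt + 0.5 * a * dt ** 2
        v = v + a * dt
        # first-order rotation update: exp([w dt]_x) ~ I + [w dt]_x
        wx = np.array([[0.0, -w[2], w[1]],
                       [w[2], 0.0, -w[0]],
                       [-w[1], w[0], 0.0]])
        R = R @ (np.eye(3) + wx * dt)
    return R, v, p
```

With zero biases, a stationary IMU reading only the gravity reaction integrates to the identity rotation and zero velocity and position change, as expected.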
Preferably, the factor graph optimization is performed on the laser point cloud of the current frame, and the local map formed by the laser point cloud of the current frame and the historical frames is registered by using a normal distribution transformation algorithm to optimize and correct the global pose, specifically as follows:
when judging that the laser point cloud of the current frame is a key frame, calling a loop detection thread based on a laser odometer;
judging whether a history key frame similar to the current key frame exists in a history key frame set according to the loop detection thread;
if so, judging whether the time stamp interval between the adjacent historical key frame and the current key frame is larger than a threshold value, and if so, matching the current key frame with the adjacent historical key frame by using a normal distribution transformation algorithm to generate a relative coordinate transformation matrix;
and optimizing and correcting the global pose by using the relative coordinate transformation matrix.
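The keyframe gating used by the loop-detection thread — a historical keyframe that is spatially near but temporally distant from the current keyframe — might be sketched like this; the radius and time-gap values are hypothetical, and the function name is an assumption.

```python
import numpy as np
from scipy.spatial import cKDTree

def find_loop_candidate(keyframes, current_pos, current_time,
                        radius=15.0, min_time_gap=30.0):
    """Pick a loop-closure candidate among historical keyframes.

    keyframes: list of (position (3,), timestamp). A candidate must lie
    within `radius` of the current pose but be older than `min_time_gap`,
    so consecutive frames are not mistaken for loops."""
    if not keyframes:
        return None
    positions = np.array([kf[0] for kf in keyframes])
    tree = cKDTree(positions)
    for idx in tree.query_ball_point(np.asarray(current_pos), r=radius):
        if current_time - keyframes[idx][1] > min_time_gap:
            return idx   # hand this keyframe pair to NDT registration
    return None
```

The returned index identifies the historical keyframe that would then be matched against the current one with the normal distribution transformation algorithm.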
Preferably, the matching is performed on the current key frame and the similar historical key frame by using a normal distribution transformation algorithm, so as to generate a relative coordinate transformation matrix, which specifically includes:
and rasterizing the similar historical key frame serving as the target point cloud and, on the assumption that each grid cell conforms to a normal distribution, calculating the probability density function of the laser point cloud distribution in each grid cell, wherein the calculation model is:

p(x) = 1 / ((2π)^{D/2} |Σ|^{1/2}) · exp(−(x − μ)ᵀ Σ⁻¹ (x − μ) / 2)

wherein D is the dimension of the vector, μ is the mean vector, and Σ is the covariance matrix of the random vector;
putting the target point cloud into the grid cells and calculating the mean of the laser point cloud data in each grid cell, wherein the calculation model is:

μ = (1/m) · Σ_{j=1}^{m} x_j

wherein m is the number of points in the grid cell and x_j are those points;
and calculating the covariance of the point cloud data in each grid cell by using the covariance calculation formula, wherein the calculation model is:

Σ = (1/(m−1)) · Σ_{j=1}^{m} (x_j − μ)(x_j − μ)ᵀ
the current key frame is put into the grid cells as the source point cloud of the normal distribution transformation point cloud registration, and the probability of each point of the source point cloud in its grid cell is calculated, wherein the calculation model is:

p(x_i') ∝ exp(−(x_i' − μ)ᵀ Σ⁻¹ (x_i' − μ) / 2)
multiplying the probabilities of the points of the source point cloud over the grid cells to obtain the maximum likelihood function; when the likelihood function reaches its maximum value, the relative coordinate transformation matrix at that moment is the optimal solution, wherein the calculation model is:

Ψ(T) = Π_{i=1}^{n} p(x_i'),  x_i' = T · x_i = R_{3×3} · x_i + t_{3×1}

wherein R_{3×3} is the rotation matrix of the source point cloud relative to the target point cloud, t_{3×1} is the translation matrix of the source point cloud relative to the target point cloud, T is the coordinate transformation matrix of the source point cloud relative to the target point cloud, x_i' is the coordinate of each point of the source point cloud in the target point cloud coordinate system, and x_i is the coordinate of each point of the source point cloud in its own coordinate system;
the logarithm is taken of both sides of the maximum likelihood function, wherein the calculation model is:

−log Ψ(T) = Σ_{i=1}^{n} (x_i' − μ_i)ᵀ Σ_i⁻¹ (x_i' − μ_i) / 2 + const
solving the maximum value of the maximum likelihood function by using the Newton iteration method, taking the coordinate transformation matrix of the source point cloud relative to the target point cloud as the iteration variable, and solving for the coordinate transformation matrix corresponding to the maximum of the likelihood by continuously updating the iteration variable along the descent direction of the negative log-likelihood, wherein the calculation model is:

H · ΔT = −g,  T = T + ΔT

wherein H is the Hessian matrix of the negative log maximum likelihood function, g is its gradient with respect to the iteration variable, ΔT is the iteration variable increment, and T is the coordinate transformation matrix of the source point cloud relative to the target point cloud.
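The NDT pipeline of the preceding steps (rasterize the target cloud, fit a Gaussian per grid cell, score the transformed source cloud) can be sketched as follows. This computes only the negative log-likelihood objective that the Newton iteration would minimize; the cell size, minimum point count and covariance regularization are assumptions for the example.

```python
import numpy as np

def ndt_build_grid(target_points, cell_size=2.0):
    """Rasterize the target cloud and fit (mean, inverse covariance) per cell."""
    cells = {}
    for p in np.asarray(target_points, dtype=float):
        cells.setdefault(tuple(np.floor(p / cell_size).astype(int)), []).append(p)
    stats = {}
    for cell, pts in cells.items():
        pts = np.array(pts)
        if len(pts) < 4:            # too few points for a stable covariance
            continue
        mu = pts.mean(axis=0)
        cov = np.cov(pts.T) + 1e-6 * np.eye(3)   # regularize
        stats[cell] = (mu, np.linalg.inv(cov))
    return stats

def ndt_score(source_points, stats, R, t, cell_size=2.0):
    """Negative log-likelihood of the transformed source under the grid
    Gaussians — the quantity a Newton iteration over (R, t) would minimize."""
    score = 0.0
    for p in np.asarray(source_points, dtype=float):
        q = R @ p + t
        cell = tuple(np.floor(q / cell_size).astype(int))
        if cell in stats:
            mu, cov_inv = stats[cell]
            d = q - mu
            score += 0.5 * d @ cov_inv @ d
    return score
```

Registering a cloud against itself, the identity transform scores lower (better) than any small offset, which is what the iteration exploits.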
The second embodiment of the present invention provides a mapping device for an engineering machine, including:
the distortion correction unit is used for acquiring the environment information acquired by the vehicle-mounted multi-sensor fusion platform and carrying out motion distortion correction on the current frame laser point cloud of the environment information by utilizing the IMU data and the IMU odometer;
the map matching unit is used for extracting point-line characteristics and point-plane characteristics according to the curvatures of the corrected points of the laser point cloud of the current frame and matching the point-line characteristics and the point-plane characteristics with a local map;
the judging unit is used for judging whether the laser point cloud of the current frame is a key frame or not;
the IMU correction unit is used for correcting zero offset of the IMU and resetting the IMU pre-integrator;
And the global optimization unit is used for performing factor graph optimization on the laser point cloud of the current frame, registering a local map formed by the laser point cloud of the current frame and the historical frame by utilizing a normal distribution transformation algorithm, and optimizing and correcting the global pose.
A third embodiment of the present invention provides a mapping apparatus for an engineering machine, including a memory and a processor, where the memory stores a computer program, and the computer program is capable of being executed by the processor to implement a mapping method for an engineering machine as set forth in any one of the above.
A fourth embodiment of the present invention provides a computer readable storage medium, in which a computer program is stored, where the computer program can be executed by a processor of an apparatus in which the computer readable storage medium is located, to implement a mapping method of an engineering machine according to any one of the above.
The key points of the technical scheme are as follows: IMU data and an IMU odometer are used to correct the motion distortion of the current frame laser point cloud; point-line and point-plane features are extracted according to the curvature of each point of the current frame laser point cloud and matched against a local map; and factor graph optimization is performed on key frames. The zero offset of the IMU is corrected according to the output laser odometer, the IMU pre-integrator is reset, and the IMU odometer is output. When the current frame laser point cloud meets the loop detection condition, loop detection is performed on it by using a normal distribution transformation algorithm and the global pose is updated, thereby solving the mapping problem of engineering machinery in degraded scenes and realizing fast mapping and positioning in large scenes.
Drawings
FIG. 1 is a schematic flow chart of a mapping method of an engineering machine according to a first embodiment of the present invention;
fig. 2 is a schematic block diagram of a mapping device of a construction machine according to a second embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
For a better understanding of the technical solution of the present invention, the following detailed description of the embodiments of the present invention refers to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, embodiments of the invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein is merely one relationship describing the association of the associated objects, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
Depending on the context, the word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined" or "in response to determining" or "when (the stated condition or event) is detected" or "in response to detecting (the stated condition or event)", depending on the context.
References to "first/second" in the embodiments are merely used to distinguish similar objects and do not represent a particular ordering of those objects; it should be understood that objects distinguished by "first/second" may be interchanged where permitted, so that the embodiments described herein can be implemented in sequences other than those illustrated or described herein.
Specific embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
The invention discloses a mapping method, a mapping device and a readable storage medium for engineering machinery, aiming to solve the problems of slow mapping, easy mapping degradation and low positioning robustness of engineering machinery.
Referring to fig. 1, a first embodiment of the present invention provides a mapping method of a construction machine, which may be executed by a mapping device of the construction machine, and in particular, by one or more processors in the mapping device, to at least implement the following steps:
s101, acquiring environment information acquired by a vehicle-mounted multi-sensor fusion platform, and correcting motion distortion of a current frame laser point cloud of the environment information by using IMU data and an IMU odometer;
in this embodiment, the mapping device may be a terminal with data processing and analysis capabilities, such as a desktop computer, a notebook computer, a server, a workstation, etc., where a corresponding operating system and application software may be installed in the mapping device, and the functions required in this embodiment are implemented by combining the operating system and the application software; the multi-sensor fusion platform can comprise sensing equipment such as an IMU sensor, a laser radar, a laser odometer and the like.
It should be noted that one frame of lidar point cloud data is obtained by the lidar scanning one full circle (360°), during which the whole vehicle also moves at high speed. This makes the point cloud coordinates obtained at different scan angles inaccurate in the scan-start coordinate system;
specifically, in this embodiment, the IMU data may first be pre-integrated to obtain the coordinate transformation matrix of the lidar from the scan-end moment to the scan-start moment of the current-frame laser point cloud; this transformation matrix corresponds to the rotation and translation of the whole vehicle at each scanning moment relative to the scan-start moment, and the rotation and translation are then converted into a 4×4 homogeneous coordinate transformation matrix.
Performing motion distortion correction on the laser point cloud of the current frame by using the coordinate transformation matrix; wherein the coordinate transformation matrix is:

$$T_i^k = \frac{t_i - t_k}{t_{k+1} - t_k}\, T_{k+1}^k, \qquad \bar{P}_i = T_i^k P_i$$

where $T_i^k$ is the coordinate transformation matrix of each point of the current-frame laser point cloud relative to the scan-start moment, $T_{k+1}^k$ is the coordinate transformation matrix of the lidar from the scan-end moment to the scan-start moment, $t_i$ is the measurement moment of each point of the current-frame laser point cloud within the scan period, $t_k$ is the scan-start moment, $t_{k+1}$ is the scan-end moment, $\bar{P}_i$ is the coordinate of each point of the current-frame laser point cloud in the lidar coordinate system at the scan-start moment, and $P_i$ is the coordinate of each point of the current-frame laser point cloud in the lidar coordinate system at the current moment.
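The per-point de-skewing described above can be sketched as follows; this is a minimal illustrative Python sketch (not part of the disclosed embodiment), which assumes linear interpolation of the end-to-start transform and a small-angle approximation for the rotation part:

```python
import numpy as np

def deskew_point(p_i, t_i, t_k, t_k1, T_end_to_start):
    """Project one lidar point back to the scan-start frame.

    T_end_to_start is the 4x4 homogeneous transform of the lidar from the
    scan-end moment to the scan-start moment, obtained from IMU
    pre-integration.  The per-point transform is interpolated linearly by
    the time ratio (t_i - t_k) / (t_k1 - t_k); the rotation part uses a
    small-angle approximation, which is an assumption of this sketch.
    """
    s = (t_i - t_k) / (t_k1 - t_k)            # time ratio in [0, 1]
    T_i = np.eye(4)
    T_i[:3, 3] = s * T_end_to_start[:3, 3]    # interpolated translation
    T_i[:3, :3] = np.eye(3) + s * (T_end_to_start[:3, :3] - np.eye(3))
    return (T_i @ np.append(p_i, 1.0))[:3]

# a point measured exactly at the scan-start moment is left unchanged
p = deskew_point(np.array([1.0, 2.0, 3.0]), 0.0, 0.0, 0.1, np.eye(4))
```

A full implementation would interpolate on SE(3) (e.g. via quaternion slerp) rather than linearizing the rotation.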
S102, extracting point-line characteristics and point-plane characteristics according to the curvatures of the points of the laser point cloud of the current frame after correction, and matching the point-line characteristics and the point-plane characteristics with a local map;
it should be noted that the map itself is composed of the features extracted from each frame; the function of the matching is to obtain the coordinate transformation matrix of the current frame relative to the map, with which the current frame can be accurately spliced into the global map;
Specifically, in the present embodiment:
a curvature calculation model is called to generate the curvature of each point of the current-frame laser point cloud; the curvature calculation model is specifically as follows:
$$K = \frac{1}{10\, d_i}\left|\sum_{j=-5,\ j\neq 0}^{5} \left( d_{i+j} - d_i \right)\right|$$

where K is the curvature of the i-th point of the laser point cloud, $d_i$ is the distance from the i-th point to the origin of the lidar coordinate system, and $d_{i+j}\ (j=-5,-4,\dots,5,\ j\neq 0)$ are the distances from the five laser points before and after the i-th point to the origin of the lidar coordinate system;
feature extraction and division are performed on the current-frame laser point cloud according to a preset curvature threshold: points of the current-frame laser point cloud whose curvature is greater than the curvature threshold are judged to be corner points; otherwise, they are judged to be face points;
Voxel filtering is carried out on the corner points and the face points, and a characteristic point cloud of the current frame is generated;
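The curvature-based corner/face split described above can be sketched as follows; this is an illustrative Python sketch, and the exact normalization of the smoothness value and the threshold are assumptions, not values taken from the embodiment:

```python
import numpy as np

def split_features(dists, curv_thresh=0.1):
    """Classify scan points into corner and face points by local curvature.

    dists: 1-D array of ranges d_i of consecutive points in one scan line.
    The curvature of point i is taken here as the normalized magnitude of
    the summed range differences over the five neighbours on each side
    (a LOAM-style smoothness term; the normalization is an assumption).
    """
    n = len(dists)
    corners, faces = [], []
    for i in range(5, n - 5):
        neigh = np.concatenate([dists[i - 5:i], dists[i + 1:i + 6]])
        k = abs(neigh.sum() - 10.0 * dists[i]) / (10.0 * dists[i])
        (corners if k > curv_thresh else faces).append(i)
    return corners, faces

# a flat wall (constant range) yields only face points, no corners
c, f = split_features(np.full(20, 4.0))
```

Voxel filtering of the resulting corner and face sets would then thin them into the frame's feature point cloud.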
searching the local map with a kd-tree for the points closest to the corner feature in the current-frame feature point cloud, connecting these points into a line, and generating the distance from the corner point of the current-frame feature point cloud to the line; the calculation model is specifically as follows:

$$d_{\varepsilon} = \frac{\left|\left(\tilde{p}_i^{\,k+1} - p_a\right) \times \left(\tilde{p}_i^{\,k+1} - p_b\right)\right|}{\left|p_a - p_b\right|}$$

where $\tilde{p}_i^{\,k+1}$ is the coordinate, in the world coordinate system, of the i-th corner point of the current-frame feature point cloud at time k+1, and $p_a$ and $p_b$ are the coordinates, in the world coordinate system, of the two points on the corresponding edge-line feature in the local map;
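The corner-to-line residual reduces to the standard point-to-line distance; a minimal sketch (the helper name and test points are illustrative):

```python
import numpy as np

def point_to_line_dist(p, a, b):
    """Distance from corner point p to the line through map points a, b:
    |(p - a) x (p - b)| / |a - b| (cross-product area over base length)."""
    return np.linalg.norm(np.cross(p - a, p - b)) / np.linalg.norm(a - b)

# corner at (0, 1, 0) against the edge line through (-1, 0, 0) and (1, 0, 0)
d = point_to_line_dist(np.array([0.0, 1.0, 0.0]),
                       np.array([-1.0, 0.0, 0.0]),
                       np.array([1.0, 0.0, 0.0]))
```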
for the face-point features in the current-frame feature point cloud, searching the local map with a kd-tree for the five points closest to the face-point feature, and constructing the following plane equation:
Ax+By+Cz+D=0;
wherein A, B, C, D are the plane-equation coefficients, and x, y, z are the coordinates of the five points in the local map closest to the face-point feature of the current-frame feature point cloud;
substituting the coordinates of the five points in the local map corresponding to the face-point feature of the current-frame feature point cloud into the plane equation to obtain the plane-equation coefficients, and solving, from these coefficients, the distance from the face point of the current-frame feature point cloud to the local-map plane feature; the calculation model is as follows:

$$d_{H} = \frac{\left|A x_p + B y_p + C z_p + D\right|}{\sqrt{A^2 + B^2 + C^2}}$$

where $x = [x_p, y_p, z_p]^T$ is the coordinate, in the world coordinate system, of the face point in the current-frame feature point cloud;
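The plane fit over the five nearest map points and the point-to-plane distance can be sketched as follows; the least-squares fitting scheme (solving Ax + By + Cz = −1, which fixes D = 1 and fails only for planes through the origin) is an assumed implementation detail, not taken from the embodiment:

```python
import numpy as np

def plane_from_points(pts):
    """Fit Ax + By + Cz + D = 0 to the nearest map points by least squares,
    solving A*x + B*y + C*z = -1 so that D is normalized to 1 (assumed
    scheme; degenerate for planes passing through the origin)."""
    coeffs, *_ = np.linalg.lstsq(pts, -np.ones(len(pts)), rcond=None)
    A, B, C = coeffs
    return A, B, C, 1.0

def point_to_plane_dist(p, plane):
    """Standard point-to-plane distance |Ax + By + Cz + D| / sqrt(A²+B²+C²)."""
    A, B, C, D = plane
    return abs(A * p[0] + B * p[1] + C * p[2] + D) / np.sqrt(A*A + B*B + C*C)

# five points on the plane z = 1; a query point at z = 3 is 2 m away
pts = np.array([[0, 0, 1], [1, 0, 1], [0, 1, 1], [1, 1, 1], [2, 1, 1]], float)
d = point_to_plane_dist(np.array([0.0, 0.0, 3.0]), plane_from_points(pts))
```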
according to the distance between each point of the current-frame laser point cloud and the corresponding edge feature of the local map and the distance between each point and the corresponding plane feature of the local map, constructing the following residual equation:

$$f(T) = d\!\left(\tilde{p}_i^{\,W},\, m\right), \qquad \tilde{p}_i^{\,W} = T\, p_i^{\,L} = R\, p_i^{\,L} + t$$

where $\tilde{p}_i^{\,W}$ is the coordinate of the i-th point of the current-frame feature point cloud in the world coordinate system, $p_i^{\,L}$ is the coordinate of the i-th point of the current-frame feature point cloud in the lidar coordinate system, $T$ is the coordinate transformation matrix that transforms the i-th point of the current-frame feature point cloud from the lidar coordinate system to the world coordinate system, m is the local map in the world coordinate system, and R and t are respectively the rotation matrix and translation vector of the i-th point of the current-frame feature point cloud from the lidar coordinate system to the world coordinate system;
and constructing the following iterative equation from the residual equation combined with the Levenberg algorithm:

$$\left(J^{T} J + \mu I\right) \Delta T = -J^{T} f$$

where J is the Jacobian matrix of the residual equation with respect to the iteration variables, μ is the diagonal update operator, I is the identity matrix, ΔT is the single-iteration variable increment, f is the residual equation, $p_i^{\,L}$ is the coordinate of the i-th point of the current-frame feature point cloud in the lidar coordinate system, $T$ is the coordinate transformation matrix that transforms the i-th point of the current-frame feature point cloud from the lidar coordinate system to the world coordinate system, and m is the local map in the world coordinate system;
The residual sum of the feature matching is calculated; the calculation model is as follows:

$$F = \sum_{i=1}^{M} d_{\varepsilon i} + \sum_{j=1}^{N} d_{H j}$$

where $d_{\varepsilon i}$ is the residual value between the i-th corner point of the current-frame feature point cloud and the corresponding edge-line feature of the local map, $d_{Hj}$ is the residual value between the j-th face point of the current-frame feature point cloud and the corresponding plane feature of the local map, and M and N are respectively the numbers of corner points and face points in the current-frame feature point cloud;
and updating the diagonal update operator according to the residual sum of the feature matching: when the residual sum is too large, the diagonal operator is increased; otherwise, it is decreased.
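The damped iteration and the residual-driven update of the diagonal operator can be sketched as follows; the growth/shrink factors (10 and 0.1) are conventional Levenberg choices, not values from the embodiment:

```python
import numpy as np

def lm_step(J, f, mu):
    """One Levenberg step: solve (J^T J + mu I) dT = -J^T f."""
    A = J.T @ J + mu * np.eye(J.shape[1])
    return np.linalg.solve(A, -J.T @ f)

def update_mu(mu, new_cost, old_cost, up=10.0, down=0.1):
    """Grow the damping when the residual sum got worse (fall back toward
    gradient descent); shrink it when the step helped (move toward
    Gauss-Newton)."""
    return mu * up if new_cost > old_cost else mu * down

# fitting y = a*x to points (1, 2) and (2, 4): one undamped step from a = 0
J = np.array([[1.0], [2.0]])   # d(residual)/da, residual_i = a*x_i - y_i
f = np.array([-2.0, -4.0])     # residuals at a = 0
step = lm_step(J, f, mu=0.0)   # recovers a = 2 in one step
```

With mu = 0 the step degenerates to plain Gauss-Newton; a positive mu shortens the step and biases it toward the negative gradient.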
S103, judging whether the laser point cloud of the current frame is a key frame or not;
it should be noted that the key frame is determined with distance as the judging condition; for example, if the distance between the current frame and the previous key frame exceeds d, the current frame is judged to be a key frame. In this embodiment the key frame may be determined from the data collected by the laser odometer; in other embodiments it may be determined in other ways, which are not limited herein, but all such schemes fall within the protection scope of the present invention.
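The distance-gated keyframe test can be sketched as follows; the 1 m threshold and metre unit are illustrative assumptions only:

```python
import numpy as np

def is_keyframe(cur_pos, last_key_pos, d_thresh=1.0):
    """Distance-gated keyframe test: the current frame becomes a key frame
    once the odometry position has moved more than d_thresh (an assumed
    value, in metres) from the previous key frame."""
    gap = np.linalg.norm(np.asarray(cur_pos) - np.asarray(last_key_pos))
    return bool(gap > d_thresh)

flag = is_keyframe([2.0, 0.0, 0.0], [0.0, 0.0, 0.0])  # moved 2 m: key frame
```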
S104, when the current frame laser point cloud is judged to be a non-key frame, correcting the zero offset of the IMU, and resetting an IMU pre-integrator;
defining a pre-integrator for optimizing zero offset of the IMU and a pre-integrator for calculating the IMU odometer respectively;
Acquiring a laser odometer, updating a factor graph, judging whether a factor tree in the updated factor graph exceeds a preset value (the preset value can be but is not limited to 100), and reconstructing the factor graph if the factor tree exceeds the preset value;
adding an IMU original data pre-integral factor, a zero offset factor and a current frame laser odometer factor between the current frame laser point cloud and the previous frame laser point cloud into the updated factor graph;
taking the pose and the speed estimated by the pre-integrator for optimizing the IMU zero offset as the estimated values of the current frame pose factor and the current frame rate factor, taking the zero offset optimized by the previous frame as the estimated value of the current frame zero offset factor, and performing factor graph optimization to generate the speed and the zero offset after optimization and updating;
resetting a pre-integrator for optimizing zero offset of the IMU by using the zero offset after optimization and updating;
checking the speed and zero offset after optimization updating, and reconstructing a factor graph to optimize if the speed and zero offset do not meet the checking requirement;
resetting the IMU odometer pre-integrator by using the optimized and updated zero offset, and pre-integrating the IMU raw data after the current-frame laser point cloud moment to obtain a pre-integration result;
and estimating the state of the current frame laser point cloud moment by utilizing the state of the current frame laser point cloud moment and IMU zero offset obtained through factor graph optimization and combining the pre-integration result, and outputting the state as a high-frequency odometer.
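A much-simplified pre-integrator along the lines described above might look as follows; it propagates only position and velocity under an accelerometer zero offset and omits rotation and noise covariance, so it is a sketch, not the embodiment's pre-integrator:

```python
import numpy as np

class ImuPreintegrator:
    """Minimal position/velocity pre-integration between two lidar frames.
    bias_a plays the role of the accelerometer zero offset estimated by
    the factor graph; a real pre-integrator also propagates orientation,
    gyro bias, and covariance."""

    def __init__(self, bias_a):
        self.bias_a = np.asarray(bias_a, float)
        self.reset()

    def reset(self, bias_a=None):
        if bias_a is not None:           # re-seed with the optimized bias
            self.bias_a = np.asarray(bias_a, float)
        self.dv = np.zeros(3)            # accumulated velocity change
        self.dp = np.zeros(3)            # accumulated position change

    def integrate(self, accel, dt):
        a = np.asarray(accel, float) - self.bias_a   # remove zero offset
        self.dp += self.dv * dt + 0.5 * a * dt * dt  # position first
        self.dv += a * dt                            # then velocity

pre = ImuPreintegrator(bias_a=[0.0, 0.0, 0.0])
for _ in range(100):                     # 1 s of constant 1 m/s^2 forward accel
    pre.integrate([1.0, 0.0, 0.0], 0.01)
```

Calling `reset` with the newly optimized zero offset mirrors the "reset the pre-integrator with the updated bias" step above.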
And S105, when judging that the laser point cloud of the current frame is a key frame, performing factor graph optimization on the laser point cloud of the current frame, registering a local map formed by the laser point cloud of the current frame and a historical frame by using a normal distribution transformation algorithm, and optimizing and correcting the global pose.
Specifically, in the present embodiment:
when the current-frame laser point cloud is judged to be a key frame, a loop-detection thread is invoked based on the laser odometer; by deciding via the laser odometer whether to invoke the loop-detection thread even when the current frame is detected as a key frame, the problem of insufficient computing power caused by performing loop detection on every key frame is avoided;
judging whether a history key frame similar to the current key frame exists in a history key frame set according to the loop detection thread;
if so, judging whether the timestamp interval between the similar historical key frame and the current key frame is greater than a threshold value; if so, matching the current key frame with the similar historical key frame by using the normal distribution transformation algorithm to generate a relative coordinate transformation matrix;
and optimizing and correcting the global pose by using the relative coordinate transformation matrix.
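The loop-closure gating described above (a spatially similar candidate plus a minimum timestamp gap) can be sketched as follows; the 30 s gap is an illustrative assumption:

```python
def should_attempt_loop(cur_stamp, candidate_stamp, min_gap=30.0):
    """Timestamp gate for loop closure: only match the current key frame
    against a spatially similar historical key frame when the two are far
    enough apart in time, so consecutive frames are not treated as loops.
    The 30 s threshold is an illustrative assumption."""
    return abs(cur_stamp - candidate_stamp) > min_gap

ok = should_attempt_loop(100.0, 10.0)        # old revisit: attempt NDT match
rejected = should_attempt_loop(100.0, 95.0)  # near-adjacent frame: skip
```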
In one possible embodiment of the present invention, the matching between the current key frame and the similar historical key frame by using a normal distribution transformation algorithm generates a relative coordinate transformation matrix, specifically:
rasterizing the similar historical key frame as the target point cloud and, on the assumption that each grid cell follows a normal distribution, calculating the probability density function of the laser point cloud distribution of each cell; the calculation model is as follows:

$$p(x) = \frac{1}{(2\pi)^{D/2}\,|\Sigma|^{1/2}} \exp\!\left(-\frac{1}{2}(x-\mu)^{T}\Sigma^{-1}(x-\mu)\right)$$

where D is the dimension of the vector, μ is the mean vector, and Σ is the covariance matrix of the random vector;
putting the target point cloud into the grid cells and calculating the mean of the laser point cloud data in each cell; the calculation model is as follows:

$$\mu = \frac{1}{n}\sum_{i=1}^{n} x_i$$
calculating the covariance of the point cloud data in each cell using the covariance formula; the calculation model is as follows:

$$\Sigma = \frac{1}{n-1}\sum_{i=1}^{n} (x_i - \mu)(x_i - \mu)^{T}$$
putting the current key frame into the grid cells as the source point cloud for normal-distribution-transformation point cloud registration and calculating the probability of each point of the source point cloud in each cell; the calculation model is specifically as follows:

$$p(x_i') \propto \exp\!\left(-\frac{1}{2}\left(x_i' - \mu\right)^{T}\Sigma^{-1}\left(x_i' - \mu\right)\right), \qquad x_i' = T x_i = R_{3\times3}\, x_i + t_{3\times1}$$

multiplying the probabilities of each point of the source point cloud over the grid cells to obtain the maximum-likelihood function; when the maximum-likelihood function reaches its maximum, the relative coordinate transformation matrix at that moment is the optimal solution; the calculation model is specifically as follows:

$$L(T) = \prod_{i=1}^{n} p\!\left(T x_i\right)$$

where $R_{3\times3}$ is the rotation matrix of the source point cloud relative to the target point cloud, $t_{3\times1}$ is the translation vector of the source point cloud relative to the target point cloud, T is the coordinate transformation matrix of the source point cloud relative to the target point cloud, $x_i'$ is the coordinate of each point of the source point cloud in the target-cloud coordinate system, and $x_i$ is the coordinate of each point of the source point cloud in its own coordinate system;
taking the logarithm of both sides of the maximum-likelihood function; the calculation model is specifically as follows:

$$\log L(T) = \sum_{i=1}^{n} \log p\!\left(T x_i\right)$$
solving the maximum of the maximum-likelihood function by the Newton iteration method, taking the coordinate transformation matrix of the source point cloud relative to the target point cloud as the iteration variable, and finding the coordinate transformation matrix corresponding to the maximum of the maximum-likelihood function by continually updating the iteration variable along the descent direction of the negative log-likelihood; the calculation model is specifically as follows:

$$H\,\Delta T = -g, \qquad T = T + \Delta T$$

where H is the Hessian matrix of the (negative log) maximum-likelihood function, g is its gradient vector, ΔT is the iteration increment, and T is the coordinate transformation matrix of the source point cloud relative to the target point cloud.
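The NDT registration pipeline above (rasterize the target, fit a Gaussian per cell, score candidate transforms of the source) can be sketched as follows; the cell size, regularization, and scoring form are assumptions, and the Newton update itself is omitted for brevity:

```python
import numpy as np

def ndt_cells(target, cell=2.0):
    """Rasterize the target cloud; return per-cell mean and inverse covariance."""
    buckets = {}
    for p in target:
        buckets.setdefault(tuple(np.floor(p / cell).astype(int)), []).append(p)
    cells = {}
    for key, pts in buckets.items():
        pts = np.array(pts)
        if len(pts) >= 3:                            # covariance needs >= 3 points
            mu = pts.mean(axis=0)
            cov = np.cov(pts.T) + 1e-6 * np.eye(3)   # regularize near-flat cells
            cells[key] = (mu, np.linalg.inv(cov))
    return cells

def ndt_score(source, T, cells, cell=2.0):
    """Negative log-likelihood of the transformed source under the cell
    Gaussians; minimizing this over T (e.g. by Newton iteration) yields
    the loop-closure transform."""
    score = 0.0
    src_h = np.hstack([source, np.ones((len(source), 1))])
    for p in (src_h @ T.T)[:, :3]:
        key = tuple(np.floor(p / cell).astype(int))
        if key in cells:
            mu, icov = cells[key]
            d = p - mu
            score += 0.5 * d @ icov @ d
    return score

rng = np.random.default_rng(0)
target = rng.normal([1.0, 1.0, 1.0], 0.1, size=(50, 3))
cells = ndt_cells(target)
aligned = ndt_score(target, np.eye(4), cells)        # identity: good fit
shifted_T = np.eye(4)
shifted_T[0, 3] = 0.5                                # 0.5 m offset: worse fit
shifted = ndt_score(target, shifted_T, cells)
```

A misaligned transform scores strictly worse than the identity here, which is the property the Newton iteration exploits.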
Referring to fig. 2, a second embodiment of the present invention provides a mapping apparatus for a construction machine, including:
the distortion correction unit 201 is used for acquiring environment information acquired by the vehicle-mounted multi-sensor fusion platform, and performing motion distortion correction on the current frame laser point cloud of the environment information by using IMU data and an IMU odometer;
A map matching unit 202, configured to extract point-line features and point-surface features according to the corrected curvatures of the points of the laser point cloud of the current frame, and match the point-line features and the point-surface features with a local map;
a judging unit 203, configured to judge whether the current frame laser point cloud is a key frame;
an IMU correction unit 204, configured to correct zero offset of the IMU, and reset the IMU pre-integrator;
the global optimization unit 205 is configured to perform factor graph optimization on the current frame laser point cloud, register a local map formed by the current frame laser point cloud and the historical frame by using a normal distribution transformation algorithm, and optimize and correct the global pose.
A third embodiment of the present invention provides a mapping apparatus for an engineering machine, including a memory and a processor, where the memory stores a computer program, and the computer program is capable of being executed by the processor to implement a mapping method for an engineering machine as set forth in any one of the above.
A fourth embodiment of the present invention provides a computer readable storage medium, in which a computer program is stored, where the computer program can be executed by a processor of an apparatus in which the computer readable storage medium is located, to implement a mapping method of an engineering machine according to any one of the above.
The key points of the technical scheme are as follows: IMU data and an IMU odometer are used to correct the motion distortion of the current-frame laser point cloud; point-line and point-plane features are extracted according to the curvature of each point of the current-frame laser point cloud; the local map is matched with the current-frame laser point cloud, and factor graph optimization is performed on key frames. The zero offset of the IMU is corrected according to the output laser odometer, the IMU pre-integrator is reset, and the IMU odometer is output. When the current-frame laser point cloud meets the loop detection conditions, loop detection is performed on it using the normal distribution transformation algorithm and the global pose is updated, thereby solving the problem of mapping in degraded scenes for engineering machinery and realizing fast mapping and positioning in large scenes.
Illustratively, the computer programs described in the third and fourth embodiments of the present invention may be divided into one or more modules, which are stored in the memory and executed by the processor to complete the present invention. The one or more modules may be a series of computer program instruction segments capable of performing specified functions, used to describe the execution of the computer program in the mapping device of the construction machine, for example, the device described in the second embodiment of the present invention.
The processor may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor; the processor is the control center of the mapping device of the construction machine, and various interfaces and lines connect the various parts of the whole device.
The memory may be used to store the computer program and/or the modules, and the processor implements the various functions of the mapping method of the construction machine by running or executing the computer program and/or modules stored in the memory and invoking data stored in the memory. The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, a text conversion function, etc.), and the like; the data storage area may store data created according to use of the device (such as audio data, text message data, etc.). In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash card, at least one disk storage device, a flash memory device, or other non-volatile solid-state storage device.
The modules, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the present invention may implement all or part of the flow of the method of the above embodiment, which may also be completed by a computer program instructing related hardware; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, the computer program implements the steps of each method embodiment described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunication signals.
It should be noted that the above-described apparatus embodiments are merely illustrative; the units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the apparatus embodiments provided by the invention, the connection relationship between the modules indicates that they have communication connections, which may be specifically implemented as one or more communication buses or signal lines. Those of ordinary skill in the art can understand and implement the present invention without undue burden.
The present invention is not limited to the above-mentioned embodiments, and any changes or substitutions that can be easily understood by those skilled in the art within the technical scope of the present invention are intended to be included in the scope of the present invention. Therefore, the protection scope of the present invention should be subject to the protection scope of the claims.
Claims (9)
1. A mapping method of a construction machine, characterized by comprising the following steps:
acquiring environment information acquired by a vehicle-mounted multi-sensor fusion platform, and correcting motion distortion of a current frame laser point cloud of the environment information by using IMU data and an IMU odometer;
extracting point-line characteristics and point-plane characteristics according to the curvature of each point of the corrected laser point cloud of the current frame, and matching the point-line characteristics and the point-plane characteristics with a local map;
judging whether the laser point cloud of the current frame is a key frame or not;
if not, correcting the zero offset of the IMU, and resetting the IMU pre-integrator;
if so, performing factor graph optimization on the laser point cloud of the current frame, registering a local map formed by the laser point cloud of the current frame and a historical frame by using a normal distribution transformation algorithm, and optimizing and correcting the global pose.
2. The mapping method of the engineering machinery according to claim 1, wherein the obtaining the environmental information collected by the vehicle-mounted multi-sensor fusion platform and performing motion distortion correction on the current frame laser point cloud of the environmental information by using IMU data and an IMU odometer comprises the following steps:
pre-integrating the IMU data to obtain a coordinate transformation matrix of the laser radar from the end scanning moment to the start scanning moment of the current frame of laser point cloud;
Performing motion distortion correction on the laser point cloud of the current frame by using the coordinate transformation matrix; wherein, the coordinate transformation matrix is:
$$T_i^k = \frac{t_i - t_k}{t_{k+1} - t_k}\, T_{k+1}^k, \qquad \bar{P}_i = T_i^k P_i$$

where $T_i^k$ is the coordinate transformation matrix of each point of the current-frame laser point cloud relative to the scan-start moment, $T_{k+1}^k$ is the coordinate transformation matrix of the lidar from the scan-end moment to the scan-start moment, $t_i$ is the measurement moment of each point of the current-frame laser point cloud within the scan period, $t_k$ is the scan-start moment, $t_{k+1}$ is the scan-end moment, $\bar{P}_i$ is the coordinate of each point of the current-frame laser point cloud in the lidar coordinate system at the scan-start moment, and $P_i$ is the coordinate of each point of the current-frame laser point cloud in the lidar coordinate system at the current moment.
3. The mapping method of the engineering machine according to claim 1, wherein the extracting of the point-line characteristic and the point-plane characteristic according to the curvature of each point of the corrected laser point cloud of the current frame and the matching of the point-line characteristic and the point-plane characteristic with a local map are specifically as follows:
and calling a curvature calculation model to generate the curvature of each point of the laser point cloud of the current frame, wherein the curvature calculation model is specifically as follows:
$$K = \frac{1}{10\, d_i}\left|\sum_{j=-5,\ j\neq 0}^{5} \left( d_{i+j} - d_i \right)\right|$$

where K is the curvature of the i-th point of the laser point cloud, $d_i$ is the distance from the i-th point to the origin of the lidar coordinate system, and $d_{i+j}\ (j=-5,-4,\dots,5,\ j\neq 0)$ are the distances from the five laser points before and after the i-th point to the origin of the lidar coordinate system;
feature extraction and division are performed on the current-frame laser point cloud according to a preset curvature threshold: points of the current-frame laser point cloud whose curvature is greater than the curvature threshold are judged to be corner points; otherwise, they are judged to be face points;
voxel filtering is carried out on the corner points and the face points, and a characteristic point cloud of the current frame is generated;
searching the local map with a kd-tree for the points closest to the corner feature in the current-frame feature point cloud, connecting these points into a line, and generating the distance from the corner point of the current-frame feature point cloud to the line; the calculation model is specifically as follows:

$$d_{\varepsilon} = \frac{\left|\left(\tilde{p}_i^{\,k+1} - p_a\right) \times \left(\tilde{p}_i^{\,k+1} - p_b\right)\right|}{\left|p_a - p_b\right|}$$

where $\tilde{p}_i^{\,k+1}$ is the coordinate, in the world coordinate system, of the i-th corner point of the current-frame feature point cloud at time k+1, and $p_a$ and $p_b$ are the coordinates, in the world coordinate system, of the two points on the corresponding edge-line feature in the local map;
for the face-point features in the current-frame feature point cloud, searching the local map with a kd-tree for the five points closest to the face-point feature, and constructing the following plane equation:
Ax+By+Cz+D=0;
wherein A, B, C, D are the plane-equation coefficients, and x, y, z are the coordinates of the five points in the local map closest to the face-point feature of the current-frame feature point cloud;
substituting the coordinates of the five points in the local map corresponding to the face-point feature of the current-frame feature point cloud into the plane equation to obtain the plane-equation coefficients, and solving, from these coefficients, the distance from the face point of the current-frame feature point cloud to the local-map plane feature; the calculation model is as follows:

$$d_{H} = \frac{\left|A x_p + B y_p + C z_p + D\right|}{\sqrt{A^2 + B^2 + C^2}}$$

where $x = [x_p, y_p, z_p]^T$ is the coordinate, in the world coordinate system, of the face point in the current-frame feature point cloud;
according to the distance between each point of the current-frame laser point cloud and the corresponding edge feature of the local map and the distance between each point and the corresponding plane feature of the local map, constructing the following residual equation:

$$f(T) = d\!\left(\tilde{p}_i^{\,W},\, m\right), \qquad \tilde{p}_i^{\,W} = T\, p_i^{\,L} = R\, p_i^{\,L} + t$$

where $\tilde{p}_i^{\,W}$ is the coordinate of the i-th point of the current-frame feature point cloud in the world coordinate system, $p_i^{\,L}$ is the coordinate of the i-th point of the current-frame feature point cloud in the lidar coordinate system, $T$ is the coordinate transformation matrix that transforms the i-th point of the current-frame feature point cloud from the lidar coordinate system to the world coordinate system, m is the local map in the world coordinate system, and R and t are respectively the rotation matrix and translation vector of the i-th point of the current-frame feature point cloud from the lidar coordinate system to the world coordinate system;
and constructing the following iterative equation from the residual equation combined with the Levenberg algorithm:

$$\left(J^{T} J + \mu I\right) \Delta T = -J^{T} f$$

where J is the Jacobian matrix of the residual equation with respect to the iteration variables, μ is the diagonal update operator, I is the identity matrix, ΔT is the single-iteration variable increment, f is the residual equation, $p_i^{\,L}$ is the coordinate of the i-th point of the current-frame feature point cloud in the lidar coordinate system, $T$ is the coordinate transformation matrix that transforms the i-th point of the current-frame feature point cloud from the lidar coordinate system to the world coordinate system, and m is the local map in the world coordinate system;
the residual sum of the feature matching is calculated; the calculation model is as follows:

$$F = \sum_{i=1}^{M} d_{\varepsilon i} + \sum_{j=1}^{N} d_{H j}$$

where $d_{\varepsilon i}$ is the residual value between the i-th corner point of the current-frame feature point cloud and the corresponding edge-line feature of the local map, $d_{Hj}$ is the residual value between the j-th face point of the current-frame feature point cloud and the corresponding plane feature of the local map, and M and N are respectively the numbers of corner points and face points in the current-frame feature point cloud;
and updating the diagonal update operator according to the residual sum of the feature matching: when the residual sum is too large, the diagonal operator is increased; otherwise, it is decreased.
4. The method for mapping engineering machinery according to claim 1, wherein the correcting the zero offset of the IMU and resetting the IMU pre-integrator are specifically as follows:
defining a pre-integrator for optimizing zero offset of the IMU and a pre-integrator for calculating the IMU odometer respectively;
Acquiring a laser odometer, updating a factor graph, judging whether a factor tree in the updated factor graph exceeds a preset value, and reconstructing the factor graph if the factor tree exceeds the preset value;
adding an IMU original data pre-integral factor, a zero offset factor and a current frame laser odometer factor between the current frame laser point cloud and the previous frame laser point cloud into the updated factor graph;
taking the pose and velocity estimated by the pre-integrator used for optimizing the IMU zero offset as the estimates of the current frame pose factor and current frame velocity factor, taking the zero offset optimized in the previous frame as the estimate of the current frame zero offset factor, and performing factor graph optimization to generate the optimized and updated velocity and zero offset;
resetting a pre-integrator for optimizing zero offset of the IMU by using the zero offset after optimization and updating;
checking the speed and zero offset after optimization updating, and reconstructing a factor graph to optimize if the speed and zero offset do not meet the checking requirement;
resetting the IMU odometer pre-integrator with the optimized and updated zero offset, and pre-integrating the raw IMU data after the current frame laser point cloud moment to obtain a pre-integration result;
and estimating the state after the current frame laser point cloud moment by combining the state at the current frame laser point cloud moment and the IMU zero offset obtained through factor graph optimization with the pre-integration result, and outputting it as a high-frequency odometer.
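As an illustrative sketch (not the patent's pre-integrator), accumulating bias-corrected raw IMU samples between two laser point cloud moments into relative rotation, velocity, and position increments can be written as follows; the sample format, the fixed time step `dt`, and the bias arguments `ba`/`bg` are assumptions:

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0, -w[2], w[1]], [w[2], 0, -w[0]], [-w[1], w[0], 0]])

def so3_exp(phi):
    """Rodrigues formula: rotation matrix for a rotation-vector increment."""
    th = np.linalg.norm(phi)
    if th < 1e-10:
        return np.eye(3) + skew(phi)
    a = phi / th
    K = skew(a)
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)

def preintegrate(imu, dt, ba, bg):
    """Accumulate bias-corrected (accel, gyro) samples into delta-R, delta-v,
    delta-p between two lidar frames (gravity is handled by the caller)."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for acc, gyr in imu:
        a = acc - ba                     # remove accelerometer zero offset
        w = gyr - bg                     # remove gyroscope zero offset
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt * dt
        dv = dv + (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return dR, dv, dp
```

Resetting the pre-integrator after factor graph optimization amounts to restarting this accumulation with the newly optimized `ba`/`bg`.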
5. The mapping method of the engineering machine according to claim 1, wherein the factor graph optimization is performed on the laser point cloud of the current frame, and the registration is performed on the local map formed by the laser point cloud of the current frame and the history frames by using a normal distribution transformation algorithm, so as to optimize and correct the global pose, specifically:
when judging that the laser point cloud of the current frame is a key frame, calling a loop detection thread based on a laser odometer;
judging whether a history key frame similar to the current key frame exists in a history key frame set according to the loop detection thread;
if so, judging whether the timestamp interval between the similar historical key frame and the current key frame is larger than a threshold value, and if it is, matching the current key frame with the similar historical key frame by using the normal distribution transformation algorithm to generate a relative coordinate transformation matrix;
and optimizing and correcting the global pose by using the relative coordinate transformation matrix.
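The gating in this claim, a historical key frame that is spatially close under the laser odometer but temporally distant, can be sketched as a simple candidate search. The record fields and the threshold values are invented for illustration:

```python
import numpy as np

def find_loop_candidate(keyframes, current, radius=5.0, min_dt=30.0):
    """keyframes: list of dicts with 'pos' (xyz array) and 'stamp' (seconds).
    Return the index of the nearest historical key frame that is within
    `radius` meters but more than `min_dt` seconds older, else None."""
    best, best_d = None, radius
    for i, kf in enumerate(keyframes):
        d = np.linalg.norm(kf['pos'] - current['pos'])
        if d < best_d and (current['stamp'] - kf['stamp']) > min_dt:
            best, best_d = i, d
    return best
```

A hit would then be passed to the normal distribution transformation matching step to produce the relative coordinate transformation matrix.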
6. The mapping method of an engineering machine according to claim 5, wherein the matching of the current key frame and the similar historical key frames by using a normal distribution transformation algorithm generates a relative coordinate transformation matrix, specifically:
rasterizing the similar historical key frame as the target point cloud and, assuming each grid cell follows a normal distribution, calculating the probability density function of the laser point cloud distribution of each grid, with the following calculation model:
p(x) = 1 / ((2π)^{D/2} |Σ|^{1/2}) · exp(−(x − μ)ᵀ Σ⁻¹ (x − μ) / 2)
wherein D is the dimension of the vector, μ is the mean vector, and Σ is the covariance matrix of the random vector;
putting the target point cloud into the grid cells and calculating the mean of the laser point cloud data in each grid, with the following calculation model:
μ = (1/m) Σ_{k=1}^{m} x_k
wherein x_k are the m points falling in the grid cell;
and calculating the covariance of the point cloud data in each grid using the covariance calculation formula:
Σ = (1/(m − 1)) Σ_{k=1}^{m} (x_k − μ)(x_k − μ)ᵀ;
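The two per-cell statistics above, the mean and covariance after rasterization, can be sketched as follows; the cell size, the dictionary layout, and the regularizer for single-point cells are illustrative assumptions:

```python
import numpy as np

def ndt_grid_stats(points, cell=1.0):
    """Rasterize the target cloud and compute, per occupied grid cell, the
    mean vector and covariance matrix parameterizing its normal distribution."""
    cells = {}
    for p in points:
        key = tuple(np.floor(p / cell).astype(int))   # integer grid index
        cells.setdefault(key, []).append(p)
    stats = {}
    for key, pts in cells.items():
        pts = np.asarray(pts)
        mu = pts.mean(axis=0)
        if len(pts) > 1:
            diff = pts - mu
            cov = diff.T @ diff / (len(pts) - 1)      # sample covariance
        else:
            cov = np.eye(points.shape[1]) * 1e-3      # degenerate cell: small regularizer
        stats[key] = (mu, cov)
    return stats
```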
the current key frame is put into the grid cells as the source point cloud of the normal distribution transformation point cloud registration, and the probability of each point of the source point cloud in its grid cell is calculated, with the following calculation model:
p(x_k') ∝ exp(−(x_k' − μ)ᵀ Σ⁻¹ (x_k' − μ) / 2)
wherein x_k' is the k-th source point transformed into the target point cloud coordinate system;
multiplying the probabilities of the points of the source point cloud over the grid cells gives the maximum likelihood function; when the likelihood function reaches its maximum value, the relative coordinate transformation matrix at that point is the optimal solution. The calculation model is:
L(T) = Π_{k=1}^{n} p(T·x_k), with x_k' = T·x_k = R_{3×3}·x_k + t_{3×1}
wherein R_{3×3} is the rotation matrix of the source point cloud relative to the target point cloud, t_{3×1} is the translation vector of the source point cloud relative to the target point cloud, T is the coordinate transformation matrix of the source point cloud relative to the target point cloud, x_k' are the coordinates of the source points in the target point cloud coordinate system, and x_k are the coordinates of the source points in their own coordinate system;
the logarithm is taken on both sides of the maximum likelihood function, giving the calculation model:
log L(T) = Σ_{k=1}^{n} log p(T·x_k);
solving for the maximum of the maximum likelihood function by the Newton iteration method, taking the coordinate transformation matrix of the source point cloud relative to the target point cloud as the iteration variable, and continuously updating the iteration variable along the descending direction of the gradient of the negative log-likelihood until the coordinate transformation matrix corresponding to the maximum is found, with the update model:
T = T + ΔT
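To make the score being maximized concrete, here is a 2D sketch: transformed source points are scored under the per-cell Gaussians, and candidate transformations are compared. A grid search over translations stands in for the Newton iteration of the claim; the cell size, candidate set, and `stats` layout (cell index → (mean, covariance)) are assumptions:

```python
import numpy as np

def ndt_score(source, stats, cell, T):
    """Sum of per-point log-probabilities (un-normalized Gaussian exponents)
    of the transformed 2D source points under the target cells' distributions."""
    R, t = T[:2, :2], T[:2, 2]
    score = 0.0
    for p in source:
        q = R @ p + t                                 # point in target coordinates
        key = tuple(np.floor(q / cell).astype(int))
        if key not in stats:                          # no distribution for this cell
            continue
        d = q - stats[key][0]
        score += -0.5 * d @ np.linalg.solve(stats[key][1], d)
    return score

def best_transform(source, stats, cell, candidates):
    """Pick the candidate translation with the largest score (a stand-in for
    the Newton update T = T + dT of the claim)."""
    best_T, best_s = None, -np.inf
    for tx, ty in candidates:
        T = np.array([[1., 0., tx], [0., 1., ty], [0., 0., 1.]])
        s = ndt_score(source, stats, cell, T)
        if s > best_s:
            best_T, best_s = T, s
    return best_T
```

Because the score is quadratic in the residual offset, the correct relative translation scores strictly higher than shifted alternatives, which is what the Newton iteration exploits via gradients.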
7. A mapping device of a construction machine, characterized by comprising:
the distortion correction unit is used for acquiring the environment information acquired by the vehicle-mounted multi-sensor fusion platform and carrying out motion distortion correction on the current frame laser point cloud of the environment information by utilizing the IMU data and the IMU odometer;
the map matching unit is used for extracting point-line characteristics and point-plane characteristics according to the curvatures of the corrected points of the laser point cloud of the current frame and matching the point-line characteristics and the point-plane characteristics with a local map;
The judging unit is used for judging whether the laser point cloud of the current frame is a key frame or not;
the IMU correction unit is used for correcting zero offset of the IMU and resetting the IMU pre-integrator;
and the global optimization unit is used for performing factor graph optimization on the laser point cloud of the current frame, registering a local map formed by the laser point cloud of the current frame and the historical frame by utilizing a normal distribution transformation algorithm, and optimizing and correcting the global pose.
8. A mapping apparatus of a construction machine, comprising a memory and a processor, wherein the memory stores a computer program executable by the processor to implement a mapping method of a construction machine according to any one of claims 1 to 6.
9. A computer readable storage medium, storing a computer program executable by a processor of a device in which the computer readable storage medium is located, to implement a mapping method of a construction machine according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310296460.2A CN116337072A (en) | 2023-03-24 | 2023-03-24 | Construction method, construction equipment and readable storage medium for engineering machinery |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116337072A true CN116337072A (en) | 2023-06-27 |
Family
ID=86885268
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310296460.2A Pending CN116337072A (en) | 2023-03-24 | 2023-03-24 | Construction method, construction equipment and readable storage medium for engineering machinery |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116337072A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN117739968A (en) * | 2023-12-22 | 2024-03-22 | 武汉大学 | Robust real-time positioning and mapping method and system suitable for complex under-forest environment
CN117739968B (en) * | 2023-12-22 | 2024-07-09 | 武汉大学 | Robust real-time positioning and mapping method and system suitable for complex under-forest environment
CN117974926A (en) * | 2024-03-28 | 2024-05-03 | 苏州艾吉威机器人有限公司 | Positioning and mapping method and system based on mechanical rotation type three-dimensional laser
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN116337072A (en) | Construction method, construction equipment and readable storage medium for engineering machinery | |
CN109614935B (en) | Vehicle damage assessment method and device, storage medium and electronic equipment | |
CN113189989B (en) | Vehicle intention prediction method, device, equipment and storage medium | |
EP4295268A1 (en) | Computer vision systems and methods for supplying missing point data in point clouds derived from stereoscopic image pairs | |
CN111915657A (en) | Point cloud registration method and device, electronic equipment and storage medium | |
CN115810133B (en) | Welding control method based on image processing and point cloud processing and related equipment | |
CN113298910A (en) | Method, apparatus and storage medium for generating traffic sign line map | |
CN112198878A (en) | Instant map construction method and device, robot and storage medium | |
CN112946612B (en) | External parameter calibration method and device, electronic equipment and storage medium | |
CN114092771A (en) | Multi-sensing data fusion method, target detection device and computer equipment | |
CN116358528A (en) | Map updating method, map updating device, self-mobile device and storage medium | |
CN116543143A (en) | Training method of target detection model, target detection method and device | |
CN115406452A (en) | Real-time positioning and mapping method, device and terminal equipment | |
CN115619954A (en) | Sparse semantic map construction method, device, equipment and storage medium | |
CN112364693B (en) | Binocular vision-based obstacle recognition method, device, equipment and storage medium | |
CN113808196A (en) | Plane fusion positioning method and device, electronic equipment and storage medium | |
CN115388878A (en) | Map construction method and device and terminal equipment | |
CN115236643A (en) | Sensor calibration method, system, device, electronic equipment and medium | |
CN113721240A (en) | Target association method and device, electronic equipment and storage medium | |
CN109711363B (en) | Vehicle positioning method, device, equipment and storage medium | |
CN110763243B (en) | Sliding map updating method and device | |
CN113763468A (en) | Positioning method, device, system and storage medium | |
CN112085119A (en) | Data processing method, device, equipment and storage medium | |
CN117765097A (en) | Sensor external parameter calibration method and device, computer equipment and storage medium | |
CN114419128A (en) | Non-contact space perimeter detection method, device, equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||