CN108871353B - Road network map generation method, system, equipment and storage medium - Google Patents

Road network map generation method, system, equipment and storage medium

Info

Publication number
CN108871353B
Authority
CN
China
Prior art keywords
road network
point cloud
network map
point
map generation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810708643.XA
Other languages
Chinese (zh)
Other versions
CN108871353A (en)
Inventor
张波
杨骋
俞铭琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Xijing Technology Co ltd
Original Assignee
Shanghai Westwell Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Westwell Information Technology Co Ltd filed Critical Shanghai Westwell Information Technology Co Ltd
Priority to CN201810708643.XA
Publication of CN108871353A
Application granted
Publication of CN108871353B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G01C21/32 Structuring or formatting of map data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a road network map generation method, system, equipment and storage medium. The road network map generation method comprises: acquiring the data collected by a laser radar on a mobile acquisition device to obtain multi-frame point cloud data, and acquiring the data collected by an inertial measurement unit on the mobile acquisition device corresponding to each frame, the latter comprising the linear velocity and angular velocity of the mobile acquisition device at the current frame; generating a high-precision map according to the point cloud data and the acquired data of the inertial measurement unit; and extracting the road network map from the high-precision map. The invention provides a technical scheme that automatically extracts and generates the road network map, solving the low precision and low efficiency of the manual extraction method in the prior art; it can be used for road network map extraction in a specific open industrial area, and is highly automated, robust and precise.

Description

Road network map generation method, system, equipment and storage medium
Technical Field
The invention relates to the technical field of automatic driving, in particular to a road network map generation method, a road network map generation system, road network map generation equipment and a storage medium, which are suitable for automatic driving.
Background
With the continuous evolution of automatic driving technology, the requirements on high-precision maps and road network maps keep rising; automatic driving scenarios at level L4 and above all require centimeter-level maps. A high-precision map is a machine-oriented map for automatic driving vehicles: it not only has high-precision coordinates but also an accurate road shape, and contains the slope, curvature, heading, elevation and banking data of each lane. Beyond describing roads, a high-precision map describes how many lanes a road has, truly reflecting the actual layout of the roads. The road network map is an information extract of the high-precision map used for path planning of automatic driving vehicles; specifically, it is a map of the mesh network formed by roads of all levels, restored from the high-precision map, so it also contains the road-level and lane-level data associated with the roads in the high-precision map. At present, the precision and accuracy of road network map extraction have become one of the bottlenecks of high-level automatic driving.
Disclosure of Invention
Aiming at the problems in the prior art, the invention aims to provide a road network map generation method, system, equipment and storage medium, solving the low precision and low efficiency of the manual extraction method.
The embodiment of the invention provides a road network map generation method, which comprises the following steps:
S100: acquiring the data collected by a laser radar on a mobile acquisition device running in a preset scene to obtain multi-frame point cloud data, and acquiring the data collected by an inertial measurement unit on the mobile acquisition device corresponding to each frame, wherein the data collected by the inertial measurement unit comprises the linear velocity and the angular velocity of the mobile acquisition device at the current frame;
S200: generating a high-precision map according to the point cloud data and the acquired data of the inertial measurement unit;
S300: extracting a road network map from the high-precision map;
wherein the step S200 comprises the following steps:
S201: filtering each frame of point cloud to obtain the point cloud data within a preset height range above the road surface;
S202: registering the point clouds of all frames in sequence, taking the frame currently being registered as the point cloud to be registered, and selecting a reference point cloud;
S203: dividing the space occupied by the reference point cloud into a number of grids of preset size, calculating the mean and covariance matrix of the multi-dimensional normal distribution of each grid, and initializing the iteration count;
S204: calculating a rotation matrix and a translation matrix according to the linear velocity and angular velocity of the mobile acquisition device at the current frame;
S205: for the point cloud to be registered, converting the points of that frame of point cloud into the grids of the reference point cloud according to the rotation matrix and the translation matrix;
S206: calculating the probability density of each conversion point according to the normal distribution parameters;
S207: averaging the probability densities of all conversion points to obtain an NDT registration score;
S208: judging whether the NDT registration score reaches a preset convergence condition; if so, continuing to step S210, otherwise continuing to step S209;
S209: optimizing the NDT registration score according to a Newton optimization algorithm to obtain an updated linear velocity and angular velocity, replacing the linear velocity and angular velocity of the mobile acquisition device at the current frame with the updated values, adding one to the iteration count, and continuing to step S204;
S210: taking the current rotation matrix and translation matrix as the rotation matrix and translation matrix of the current frame, recording the rotation matrices and translation matrices of all frames, and superimposing the original point cloud frames on the reference point cloud to obtain the point cloud map in the world coordinate system.
Optionally, the step S203 of calculating a mean and a covariance matrix of the multidimensional normal distribution of each grid includes the following steps:
obtaining the coordinate values Pi(xi, yi, zi) of each point in the current grid, i ∈ (1, n), where n is the number of points in the grid;
the mean value Pmean (x) of the multi-dimensional normal distribution of the current grid is calculated according to the following formulam,ym):
Figure BDA0001716163840000021
The covariance matrix of the current grid is calculated according to the following formula:
cov(x,x) = (1/n) * Σ_{i=1..n} (xi - xm)^2
cov(x,y) = (1/n) * Σ_{i=1..n} (xi - xm)(yi - ym)
cov(y,x) = (1/n) * Σ_{i=1..n} (yi - ym)(xi - xm)
cov(y,y) = (1/n) * Σ_{i=1..n} (yi - ym)^2
C = [cov(x,x), cov(x,y)
     cov(y,x), cov(y,y)]
wherein, the calculated C is the covariance matrix of the current grid.
Optionally, in step S204, the step of calculating a rotation matrix and a translation matrix according to the linear velocity and the angular velocity of the current frame includes the following steps:
the rotation matrix R and translation matrix T of the current frame are calculated according to the following formulas:
T = [V(xv)*t, V(yv)*t, V(zv)*t]
R = [W(xw)*t, 0, 0
     0, W(yw)*t, 0
     0, 0, W(zw)*t]
where t is the interval between two frames, V(xv, yv, zv) is the linear velocity of the current frame, and W(xw, yw, zw) is the angular velocity of the current frame.
Optionally, in the step S205, for the point cloud to be registered, converting the points of that frame of point cloud into the grids of the reference point cloud according to the rotation matrix and the translation matrix comprises the following steps:
the registered point cloud P1 is transformed into the mesh of the reference point cloud by a rotation matrix R and a translation matrix T according to the following formula:
P2=P1*R+T
where P1 is the registered point cloud before transformation, P2 is the transformed point cloud, and represents a matrix multiplication.
Optionally, in the step S206, calculating the probability density of each conversion point according to the normal distribution parameter includes the following steps:
the probability density for each transition point is calculated according to the following formula:
f(x, y) = 1 / (2π * σ1 * σ2 * sqrt(1 - ρ^2)) * exp( -1 / (2(1 - ρ^2)) * [ (x - μ1)^2 / σ1^2 - 2ρ(x - μ1)(y - μ2) / (σ1 * σ2) + (y - μ2)^2 / σ2^2 ] )
ρ = (sqrt(C(0,1) + C(1,0)))^2
where the coordinates of the conversion point are (x, y), μ1 and μ2 are respectively the means Pmean(xm) and Pmean(ym) of the multi-dimensional normal distribution of the current grid, and σ1 and σ2 are the values of C(0,0) and C(1,1), respectively.
Optionally, in the step S208, judging whether the NDT registration score reaches the preset convergence condition comprises the following steps:
if the current iteration count has reached the preset iteration count N, judging whether the maximum fluctuation of the NDT registration score over the last N iterations is greater than a preset fluctuation threshold; if so, judging that the NDT registration score has not reached the preset convergence condition, otherwise judging that the NDT registration score has reached the preset convergence condition;
and if the current iteration count has not reached the preset iteration count N, judging that the NDT registration score has not reached the preset convergence condition.
Optionally, in step S209, optimizing the NDT registration score according to a newton optimization algorithm to obtain an updated linear velocity and an updated angular velocity, including the following steps:
calculating the sum of probability densities of all the conversion points
L = Σ_{i=1..n} fi(xi, yi)
where fi(xi, yi) is the probability density of the i-th conversion point and n is the number of conversion points in the current grid; the sum L is taken as the iterative objective function;
the angular velocity of the current frame is updated according to the following formula:
W(xw)k+1=W(xw)k–L'(W(xw)k)/L”(W(xw)k)
W(yw)k+1=W(yw)k–L'(W(yw)k)/L”(W(yw)k)
W(zw)k+1=W(zw)k–L'(W(zw)k)/L”(W(zw)k)
where k is the current iteration count, W(xw)k, W(yw)k and W(zw)k are the angular velocity components before the update, W(xw)k+1, W(yw)k+1 and W(zw)k+1 are the updated angular velocity components, and L' and L'' are respectively the first and second derivatives of the objective function;
updating the linear velocity of the current frame according to the following formula:
V(xv)k+1=V(xv)k–L'(V(xv)k)/L”(V(xv)k)
V(yv)k+1=V(yv)k–L'(V(yv)k)/L”(V(yv)k)
V(zv)k+1=V(zv)k–L'(V(zv)k)/L”(V(zv)k)
where V(xv)k, V(yv)k and V(zv)k are the linear velocity components before the update, and V(xv)k+1, V(yv)k+1 and V(zv)k+1 are the updated linear velocity components.
Optionally, between steps S200 and S300, the method further includes the following steps:
S211: acquiring the image collected by the camera, and adding RGB values to the point cloud map in the world coordinate system according to the pose transformation relation between the camera and the laser radar.
Optionally, the step S211 includes the following steps:
acquiring an image acquired by a camera, and projecting a point obtained by laser radar measurement back to a two-dimensional coordinate system of the camera according to the pose transformation relation of the laser radar and the camera;
finding the RGB value of a pixel point closest to the projection point in the image collected by the camera, and taking the RGB value as the RGB value of the point obtained by current laser radar measurement;
and if the projection point is out of the image collected by the camera, setting the RGB value of the point measured by the current laser radar as a default value.
Optionally, the step S300 of extracting the road network map from the high-precision map includes the following steps:
S301: extracting the road surface by using a region-growing algorithm, with the normal vector and curvature as the connectivity criteria for region growing;
S302: filtering the extracted point cloud of the road surface with a preset light intensity threshold, and keeping the points whose light intensity is greater than the preset light intensity threshold to obtain the point cloud of the lane lines;
S303: performing Euclidean distance clustering on the obtained point cloud of the lane lines to obtain the clustered point cloud of each lane line;
S304: for each cluster, using straight-line fitting to obtain the characteristic parameters (a, b, c, x1, y1, z1, x2, y2, z2) of each lane line, wherein a, b and c are the straight-line fitting result of the lane line, (x1, y1, z1) are the coordinates of one end point of the lane line, and (x2, y2, z2) are the coordinates of the other end point of the lane line.
The embodiment of the present invention further provides a road network map generating system, configured to implement the road network map generating method, where the system includes:
the data acquisition module is used for acquiring the data collected by the laser radar, obtaining multi-frame point cloud data, and acquiring the data collected by the inertial measurement unit corresponding to each frame;
the high-precision map generation module is used for generating a high-precision map according to the point cloud data and the acquired data of the inertial measurement unit;
and the road network map extraction module is used for extracting the road network map from the high-precision map.
The present invention also provides a road network map generating device, comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein said processor is configured to perform the steps of said road network map generation method via execution of said executable instructions.
The present invention also provides a computer-readable storage medium for storing a program, which when executed implements the steps of the road network map generation method.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
The road network map generation method, system, equipment and storage medium of the present invention have the following advantages:
The invention provides a technical scheme for road network map generation that automatically extracts and generates the road network map, solving the low precision and low efficiency of the manual extraction method in the prior art. It can be used for road network map extraction in a specific open industrial area, and is highly automated, robust and precise.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings.
Fig. 1 is a flowchart of a road network map generation method according to an embodiment of the present invention;
FIG. 2 is a flow chart of generating a high accuracy map from the point cloud data and the acquired data of the inertial measurement unit according to an embodiment of the present invention;
FIG. 3 is a flow chart of extracting a road network map from a high-precision map according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a doubly linked list based map structure according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a road network map generation system according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a road network map generating device according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
As shown in fig. 1, an embodiment of the present invention provides a road network map generating method, including the following steps:
S100: acquiring sensor data: acquiring the data collected by a laser radar on a mobile acquisition device running in a preset scene to obtain multi-frame point cloud data, and acquiring the data collected by an inertial measurement unit on the mobile acquisition device corresponding to each frame, wherein the data collected by the inertial measurement unit comprises the linear velocity and the angular velocity of the mobile acquisition device at the current frame;
in this embodiment, the sensor group for data acquisition comprises an inertial measurement unit, at least one lidar, and at least one high-dynamic fisheye camera. The support frame is machined by a digital machine tool, the sensor is fixedly connected to the data acquisition device, and the data acquisition device can move in a specific open industrial area, so that multi-frame point cloud data of different time can be acquired, and linear velocity and angular velocity of the mobile acquisition device at each moment can be acquired according to the inertial measurement unit. The mobile acquisition device can be various movable devices such as a vehicle, a robot, an unmanned aerial vehicle and the like provided with a laser radar and an inertia measurement unit.
S200: generating a high-precision map according to the point cloud data and the acquired data of the inertial measurement unit;
S300: extracting a road network map from the high-precision map;
as shown in fig. 2, the step S200 includes the following steps:
S201: filtering each frame of point cloud to obtain the point cloud data within a preset height range above the road surface; for example, higher points are removed and the point cloud within 50 cm above the road surface is kept, but the invention is not limited thereto, and the specific height can be set as needed;
S202: registering the point clouds of all frames in sequence, taking the frame currently being registered as the point cloud to be registered, and selecting a reference point cloud as the world map;
S203: dividing the space occupied by the reference point cloud into a number of grids of preset size, calculating the mean and covariance matrix of the multi-dimensional normal distribution of each grid, and initializing the iteration count k to 0;
S204: calculating a rotation matrix and a translation matrix according to the linear velocity and angular velocity of the mobile acquisition device at the current frame, where, on the first iteration for the current frame, the linear velocity and angular velocity are the output data of the inertial measurement unit, and on subsequent iterations they are the values updated in step S209 of the previous iteration;
S205: for the point cloud to be registered, converting the points of that frame of point cloud into the grids of the reference point cloud according to the rotation matrix and the translation matrix;
S206: calculating the probability density of each conversion point according to the normal distribution parameters;
S207: averaging the probability densities of all conversion points to obtain an NDT registration score;
S208: judging whether the NDT registration score reaches a preset convergence condition; if so, continuing to step S210, otherwise continuing to step S209;
S209: optimizing the NDT registration score according to a Newton optimization algorithm to obtain an updated linear velocity and angular velocity, then replacing the linear velocity and angular velocity of the current frame with the updated values, adding one to the iteration count k, and continuing to step S204;
S210: taking the current rotation matrix and translation matrix as the rotation matrix and translation matrix of the current frame, registering each frame in the manner of steps S204-S210 to obtain the rotation matrix and translation matrix of every frame, and superimposing the original point cloud frames measured by the laser radar onto the reference point cloud according to the rotation matrices and translation matrices of all frames to obtain the point cloud map in the world coordinate system.
In this embodiment, each sensor is first calibrated. Calibration may be performed according to the following procedure. The calibration method is only an example, and the present invention is not limited thereto, and other existing sensor calibration methods are also possible and all fall within the scope of the present invention.
1. Calibrating the fisheye camera:
(1) the calibration plate is a black and white checkerboard with 11 x 9;
(2) binarize, erode and dilate the input image;
(3) traverse each contour; if a contour is closed, has 4 corner points and approximately equal side lengths, consider it an inner square and push it into a queue;
(4) search for adjacent squares and mark them as neighbors of the current square;
(5) search for interconnected squares: traverse all squares; if a square has neighbors and is not yet classified, assign it to a new group, and if the neighbors of its neighbors are unclassified, add them to the queue;
(6) sort the squares: order all interconnected squares, check the number of neighbors of each square, and determine which column and row each square lies in by whether its corner coordinates are on a straight line;
(7) find sub-pixel corner points: for the finally extracted squares, locate their precise sub-pixel corner points and return them;
(8) the distortion parameters of the fisheye camera are established as follows: {k1, k2, k3, k4}; the intrinsic parameter matrix has the form:
K = [fx, 0, cx
     0, fy, cy
     0, 0, 1]
(9) continuously performing iterative operation by using a Jacobian matrix to finally obtain a translation matrix, a rotation matrix and a rotation vector;
(10) obtain an error measure by uncertainty evaluation, and confirm a pass if the error is within a set range; finally, return the fisheye camera's intrinsic parameters and distortion coefficients;
(11) inversely transforming the image coordinate system into a camera coordinate system;
(12) then apply the inverse rectification transform R^-1; for a camera without a rectification transform, this is generally defaulted to the identity matrix;
(13) normalize and apply the camera lens distortion model;
(14) convert from camera coordinates to image coordinates.
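For illustration only, the checkerboard calibration above can be sketched with OpenCV's fisheye module standing in for the hand-rolled square detection; the board size, image file names and flags here are assumptions, not part of the original disclosure:

```python
import cv2
import numpy as np

# Sketch of fisheye calibration: detect checkerboard corners, then estimate
# K = {fx, fy, cx, cy} and the distortion parameters {k1, k2, k3, k4}.
pattern = (10, 8)  # inner corners of an assumed 11 x 9 board
objp = np.zeros((1, pattern[0] * pattern[1], 3), np.float32)
objp[0, :, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for path in ["calib_000.png", "calib_001.png"]:  # hypothetical sample images
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-6)
        corners = cv2.cornerSubPix(gray, corners, (5, 5), (-1, -1), criteria)
        obj_pts.append(objp)
        img_pts.append(corners.reshape(1, -1, 2))

K = np.zeros((3, 3))   # intrinsic matrix, estimated by the call below
D = np.zeros((4, 1))   # distortion parameters {k1, k2, k3, k4}
rms, K, D, rvecs, tvecs = cv2.fisheye.calibrate(
    obj_pts, img_pts, gray.shape[::-1], K, D,
    flags=cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC)
print("RMS reprojection error:", rms)  # step (10): pass if within tolerance
```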
2. Laser visual calibration:
(1) collecting calibration sample data of the calibration plate at a position where a camera and laser can be simultaneously seen;
(2) converting the world coordinate system into a camera coordinate system;
(3) fit the three-dimensional plane equation of the calibration board at each position;
(4) according to a plane equation of the calibration plate, solving the plane equation of the laser by using a normal vector of the calibration plate under a laser coordinate system;
(5) optimizing the product of the normal vector of the laser and the visual projection to the laser coordinate system by using a least square method;
(6) all three-dimensional laser coordinates are used to fit the laser plane equation AX + BY + CZ = D, yielding the plane equation coefficients A, B, C and D.
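A minimal sketch of the plane fit in step (6), assuming the laser points are given as an (n, 3) numpy array:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane AX + BY + CZ = D: SVD of the centered laser
    points; the smallest right singular vector is the plane normal
    (A, B, C), and D is the normal dotted with the centroid."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    A, B, C = normal
    D = float(normal @ centroid)
    return A, B, C, D
```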
And then using a mobile acquisition device provided with a calibrated sensor to drive in a target scene to acquire data.
In this embodiment, the calculating the mean and covariance matrix of the multidimensional normal distribution of each grid in step S203 includes the following steps:
obtaining the coordinate values Pi(xi, yi, zi) of each point in the current grid, i ∈ (1, n), where n is the number of points in the grid;
the mean Pmean(xm, ym) of the multi-dimensional normal distribution of the current grid is calculated according to the following formulas:
xm = (1/n) * Σ_{i=1..n} xi, ym = (1/n) * Σ_{i=1..n} yi
The covariance matrix of the current grid is calculated according to the following formula:
cov(x,x) = (1/n) * Σ_{i=1..n} (xi - xm)^2
cov(x,y) = (1/n) * Σ_{i=1..n} (xi - xm)(yi - ym)
cov(y,x) = (1/n) * Σ_{i=1..n} (yi - ym)(xi - xm)
cov(y,y) = (1/n) * Σ_{i=1..n} (yi - ym)^2
C = [cov(x,x), cov(x,y)
     cov(y,x), cov(y,y)]
wherein, the calculated C is the covariance matrix of the current grid.
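For illustration, a minimal numpy sketch of this per-grid computation, assuming the points have already been binned into grids upstream:

```python
import numpy as np

def grid_normal_params(points):
    """Step S203 sketch: mean Pmean(xm, ym) and 2 x 2 covariance matrix C
    of the (x, y) coordinates of the points Pi(xi, yi, zi) in one grid.
    `points` is an (n, 3) array; the name is illustrative."""
    xy = points[:, :2]
    mean = xy.mean(axis=0)          # (xm, ym)
    diff = xy - mean
    cov = diff.T @ diff / len(xy)   # [[cov(x,x), cov(x,y)], [cov(y,x), cov(y,y)]]
    return mean, cov
```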
In this embodiment, in the step S204, a rotation matrix and a translation matrix are calculated according to the linear velocity and the angular velocity of the current frame, which includes the following steps:
the rotation matrix R and translation matrix T of the current frame are calculated according to the following formulas:
T = [V(xv)*t, V(yv)*t, V(zv)*t]
R = [W(xw)*t, 0, 0
     0, W(yw)*t, 0
     0, 0, W(zw)*t]
where t is the interval between two frames, V(xv, yv, zv) is the linear velocity of the current frame, and W(xw, yw, zw) is the angular velocity of the current frame. As described above, on the first iteration for the current frame the linear velocity and angular velocity are the output data of the inertial measurement unit; on subsequent iterations they are the values updated in step S209 of the previous iteration.
Further, in the step S205, for the point cloud to be registered, converting the points of that frame of point cloud into the grids of the reference point cloud according to the rotation matrix and the translation matrix comprises the following steps:
the registered point cloud P1 is transformed into the mesh of the reference point cloud by a rotation matrix R and a translation matrix T according to the following formula:
P2=P1*R+T
where P1 is the registered point cloud before transformation, P2 is the transformed point cloud, and represents a matrix multiplication.
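The following sketch combines steps S204 and S205 under the formulas above; the diagonal form of R is taken literally from the text, and all names are illustrative assumptions:

```python
import numpy as np

def pose_from_imu(v, w, t):
    """Step S204 sketch: translation T and rotation R from the current
    linear velocity v = (V(xv), V(yv), V(zv)), angular velocity
    w = (W(xw), W(yw), W(zw)) and the inter-frame interval t."""
    T = np.asarray(v, dtype=float) * t
    R = np.diag([w[0] * t, w[1] * t, w[2] * t])  # diagonal R as in the text
    return R, T

def transform_cloud(P1, R, T):
    """Step S205: P2 = P1 * R + T for an (n, 3) point cloud P1."""
    return P1 @ R + T
```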
In this embodiment, in the step S206, calculating the probability density of each transition point according to the normal distribution parameter includes the following steps:
the probability density for each transition point is calculated according to the following formula:
f(x, y) = 1 / (2π * σ1 * σ2 * sqrt(1 - ρ^2)) * exp( -1 / (2(1 - ρ^2)) * [ (x - μ1)^2 / σ1^2 - 2ρ(x - μ1)(y - μ2) / (σ1 * σ2) + (y - μ2)^2 / σ2^2 ] )
ρ = (sqrt(C(0,1) + C(1,0)))^2
where the coordinates of the conversion point are (x, y), μ1 and μ2 are respectively the means Pmean(xm) and Pmean(ym) of the multi-dimensional normal distribution of the current grid, and σ1 and σ2 are the values of C(0,0) and C(1,1), respectively, i.e., the entries at the (0,0) and (1,1) positions of the matrix C.
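A sketch of step S206, reading the normal distribution parameters off the grid statistics exactly as defined above; the standard bivariate normal form is an assumption of this sketch:

```python
import numpy as np

def conversion_point_density(x, y, mean, C):
    """Bivariate normal density of one conversion point (x, y); mean and C
    come from grid_normal_params above (illustrative names)."""
    mu1, mu2 = mean
    s1, s2 = C[0, 0], C[1, 1]              # sigma1 = C(0,0), sigma2 = C(1,1)
    rho = np.sqrt(C[0, 1] + C[1, 0]) ** 2  # rho as defined in the text
    dx, dy = x - mu1, y - mu2
    q = dx**2 / s1**2 - 2 * rho * dx * dy / (s1 * s2) + dy**2 / s2**2
    return np.exp(-q / (2 * (1 - rho**2))) / (2 * np.pi * s1 * s2 * np.sqrt(1 - rho**2))
```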
In this embodiment, in the step S208, judging whether the NDT (Normal Distributions Transform) registration score reaches the preset convergence condition comprises the following steps:
if the current iteration count has reached the preset iteration count N, judging whether the maximum fluctuation of the NDT registration score over the last N iterations is greater than a preset fluctuation threshold; if so, judging that the NDT registration score has not reached the preset convergence condition, otherwise judging that it has; the preset fluctuation threshold and the preset iteration count can be freely set as needed;
and if the current iteration count has not reached the preset iteration count N, judging that the NDT registration score has not reached the preset convergence condition.
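The convergence test of step S208 can be sketched as follows; the score-history list and threshold names are assumptions:

```python
def ndt_converged(scores, N, fluctuation_threshold):
    """Before N iterations the score is never considered converged;
    afterwards it converges once the maximum fluctuation of the last
    N scores no longer exceeds the preset threshold."""
    if len(scores) < N:
        return False
    recent = scores[-N:]
    return (max(recent) - min(recent)) <= fluctuation_threshold
```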
In this embodiment, in step S209, the NDT registration score is optimized according to a newton optimization algorithm to obtain updated linear velocity and angular velocity, that is, the optimal rotation matrix and translation matrix of the current frame are found to maximize the NDT registration score value, specifically, step S209 includes the following steps:
calculating the sum of probability densities of all the conversion points
L = Σ_{i=1..n} fi(xi, yi)
where fi(xi, yi) is the probability density of the i-th conversion point and n is the number of conversion points in the current grid; the sum L is taken as the iterative objective function;
the angular velocity of the current frame is updated according to the following formula:
W(xw)k+1=W(xw)k–L'(W(xw)k)/L”(W(xw)k)
W(yw)k+1=W(yw)k–L'(W(yw)k)/L”(W(yw)k)
W(zw)k+1=W(zw)k–L'(W(zw)k)/L”(W(zw)k)
where k is the current iteration count, W(xw)k, W(yw)k and W(zw)k are the angular velocity components before the update, W(xw)k+1, W(yw)k+1 and W(zw)k+1 are the updated angular velocity components, and L' and L'' are respectively the first and second derivatives of the objective function;
updating the linear velocity of the current frame according to the following formula:
V(xv)k+1=V(xv)k–L'(V(xv)k)/L”(V(xv)k)
V(yv)k+1=V(yv)k–L'(V(yv)k)/L”(V(yv)k)
V(zv)k+1=V(zv)k–L'(V(zv)k)/L”(V(zv)k);
where V(xv)k, V(yv)k and V(zv)k are the linear velocity components before the update, and V(xv)k+1, V(yv)k+1 and V(zv)k+1 are the updated linear velocity components.
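A sketch of the component-wise Newton update above; how the derivatives L' and L'' are obtained (e.g. by numerical differentiation) is left as an assumption:

```python
def newton_step(params, dL, d2L):
    """One Newton step p_{k+1} = p_k - L'(p_k) / L''(p_k), applied per
    component; dL and d2L are assumed callables for the first and second
    derivative of the objective L (the sum of conversion-point densities)."""
    return [p - dL(p) / d2L(p) for p in params]

# usage sketch: v holds V(xv), V(yv), V(zv) and w holds W(xw), W(yw), W(zw)
# v = newton_step(v, dL, d2L); w = newton_step(w, dL, d2L)
```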
Further, between the steps S200 and S300, the following steps are further included:
S211: acquiring the image collected by the camera, and adding RGB values to the point cloud map in the world coordinate system according to the pose transformation relation between the camera and the laser radar.
In this embodiment, the step S211 includes the steps of:
acquiring an image acquired by a camera, and projecting a point obtained by laser radar measurement back to a two-dimensional coordinate system of the camera according to the pose transformation relation of the laser radar and the camera;
finding the RGB value of a pixel point closest to the projection point in the image collected by the camera, and taking the RGB value as the RGB value of the point obtained by current laser radar measurement;
and if the projection point is out of the image collected by the camera, setting the RGB value of the point measured by the current laser radar as a default value.
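Step S211's projection-and-lookup can be sketched as follows, assuming a calibrated intrinsic matrix K and a laser-radar-to-camera pose (R_cl, t_cl); all names are illustrative:

```python
import numpy as np

def colorize_points(points, image, K, R_cl, t_cl, default_rgb=(0, 0, 0)):
    """Project laser-radar points into the camera image and take the
    nearest pixel's RGB; points projecting outside the image get a
    default value, as in the text."""
    h, w, _ = image.shape
    cam = points @ R_cl.T + t_cl                     # laser frame -> camera frame
    uvw = cam @ K.T                                  # pinhole projection
    u = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)  # nearest pixel column
    v = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)  # nearest pixel row
    rgb = np.full((len(points), 3), default_rgb, dtype=np.uint8)
    inside = (uvw[:, 2] > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    rgb[inside] = image[v[inside], u[inside]]
    return rgb
```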
In this embodiment, as shown in fig. 3, the extracting the road network map from the high-precision map in step S300 includes the following steps:
S301: extracting the road surface by using a region-growing algorithm, with the normal vector and curvature as the connectivity criteria for region growing; an existing region-growing algorithm can be adopted for this extraction;
S302: filtering the extracted point cloud of the road surface with a preset light intensity threshold, and keeping the points whose light intensity is greater than the preset light intensity threshold to obtain the point cloud of the lane lines, i.e. keeping the points with PI(x, y, z) > Ith, where PI(x, y, z) is the light intensity at point (x, y, z) and Ith is the preset, fixed light intensity threshold;
S303: performing Euclidean distance clustering on the obtained point cloud of the lane lines to obtain the clustered point cloud of each lane line;
S304: for each cluster, using straight-line fitting to obtain the characteristic parameters (a, b, c, x1, y1, z1, x2, y2, z2) of each lane line, wherein a, b and c are the straight-line fitting result of the lane line, (x1, y1, z1) are the coordinates of one end point of the lane line, and (x2, y2, z2) are the coordinates of the other end point of the lane line.
Only one method of extracting the road network map from the high-precision map is described here; in practical applications, other existing extraction methods can also be adopted, and all of them fall within the protection scope of the present invention.
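For illustration, steps S302-S304 can be sketched as follows; DBSCAN stands in for the Euclidean distance clustering and an SVD line fit for the straight-line fitting, both assumptions of this sketch:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def extract_lane_lines(road_points, intensity, I_th, eps=0.5, min_samples=10):
    """Intensity filter, clustering, and per-cluster line fit; eps and
    min_samples are assumed tuning parameters."""
    lane_pts = road_points[intensity > I_th]            # keep PI(x, y, z) > Ith
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(lane_pts)
    lines = []
    for lbl in set(labels) - {-1}:                      # -1 marks noise points
        cluster = lane_pts[labels == lbl]
        centroid = cluster.mean(axis=0)
        _, _, vt = np.linalg.svd(cluster - centroid)    # principal direction
        a, b, c = vt[0]                                 # line-fit result (a, b, c)
        proj = (cluster - centroid) @ vt[0]             # endpoints: extreme projections
        p1, p2 = cluster[proj.argmin()], cluster[proj.argmax()]
        lines.append((a, b, c, *p1, *p2))
    return lines
```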
As shown in fig. 4, the road network map of the present invention is a map based on a doubly linked list: BID1 and BID2 are forward links, FID1 and FID2 are backward links, R is the curvature, speed is the speed limit, slope is the gradient, and type is the type of the region to which the point belongs. The linked-list structure records the lane center line, keeps the curvature and gradient as reserved fields, and annotates the speed limit based on road conditions.
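A minimal sketch of one node of this doubly linked structure; the field names follow fig. 4 and the text, while the concrete layout is an assumption:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RoadNetworkNode:
    """One lane-centerline point of the doubly linked road network."""
    x: float
    y: float
    z: float
    R: float = 0.0      # curvature (reserved field)
    speed: float = 0.0  # speed limit, annotated from road conditions
    slope: float = 0.0  # gradient (reserved field)
    type: int = 0       # type of the region the point belongs to
    BID: List[int] = field(default_factory=list)  # forward links BID1, BID2
    FID: List[int] = field(default_factory=list)  # backward links FID1, FID2
```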
As shown in fig. 5, an embodiment of the present invention further provides a road network map generating system, configured to implement the road network map generating method, where the system includes:
the data acquisition module 100 is configured to acquire the data collected by the laser radar, obtain multi-frame point cloud data, and acquire the data collected by the inertial measurement unit corresponding to each frame;
the high-precision map generation module 200 is used for generating a high-precision map according to the point cloud data and the acquired data of the inertial measurement unit;
and a road network map extraction module 300, configured to extract a road network map from the high-precision map.
The embodiment of the invention also provides road network map generating equipment, which comprises a processor; a memory having stored therein executable instructions of the processor; wherein said processor is configured to perform the steps of said road network map generation method via execution of said executable instructions.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module" or "platform."
An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 6. The electronic device 600 shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 6, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
Wherein the storage unit stores program code executable by the processing unit 610, so that the processing unit 610 performs the steps according to various exemplary embodiments of the present invention described in the road network map generation method section of this specification. For example, the processing unit 610 may perform the steps shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
The embodiment of the present invention further provides a computer-readable storage medium for storing a program which, when executed, implements the steps of the road network map generation method. In some possible embodiments, aspects of the present invention may also be implemented in the form of a program product comprising program code; when the program product is run on a terminal device, the program code causes the terminal device to perform the steps according to various exemplary embodiments of the present invention described in the road network map generation method section of this specification.
Referring to fig. 7, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable storage medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable storage medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
The road network map generation method, system, equipment and storage medium of the present invention have the following advantages:
The invention provides a technical scheme for road network map generation that automatically extracts and generates the road network map, solving the low precision and low efficiency of the manual extraction method in the prior art. It can be used for road network map extraction in a specific open industrial area, and is highly automated, robust and precise.
The foregoing is a more detailed description of the invention in connection with specific preferred embodiments and it is not intended that the invention be limited to these specific details. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all shall be considered as belonging to the protection scope of the invention.

Claims (11)

1. A road network map generation method, characterized by comprising the following steps:
S100: acquiring the data collected by a laser radar on a mobile acquisition device running in a preset scene to obtain multi-frame point cloud data, and acquiring the data collected by an inertial measurement unit on the mobile acquisition device corresponding to each frame, wherein the data collected by the inertial measurement unit comprises the linear velocity and the angular velocity of the mobile acquisition device at the current frame;
S200: generating a high-precision map according to the point cloud data and the acquired data of the inertial measurement unit;
S300: extracting a road network map from the high-precision map;
wherein the step S200 comprises the following steps:
S201: filtering each frame of point cloud to obtain the point cloud data within a preset height range above the road surface;
S202: registering the point clouds of all frames in sequence, taking the frame currently being registered as the point cloud to be registered, and selecting a reference point cloud;
S203: dividing the space occupied by the reference point cloud into a number of grids of preset size, calculating the mean and covariance matrix of the multi-dimensional normal distribution of the coordinate values of each grid, and initializing the iteration count;
S204: calculating a translation matrix according to the linear velocity of the mobile acquisition device at the current frame, and a rotation matrix according to the angular velocity of the mobile acquisition device at the current frame;
S205: for the point cloud to be registered, converting the points of that frame of point cloud into the grids of the reference point cloud according to the rotation matrix and the translation matrix, comprising the following steps:
transforming the point cloud to be registered P1 into the grids of the reference point cloud through the rotation matrix R and the translation matrix T according to the following formula:
P2 = P1 * R + T
wherein P1 is the point cloud to be registered before transformation, P2 is the transformed point cloud, and * represents matrix multiplication;
S206: calculating the probability density of each conversion point according to the normal distribution parameters;
S207: averaging the probability densities of all conversion points to obtain an NDT registration score;
S208: judging whether the NDT registration score reaches a preset convergence condition; if so, continuing to step S210, otherwise continuing to step S209;
S209: optimizing the NDT registration score according to a Newton optimization algorithm, taking the sum of the probability densities of all conversion points as the iterative objective function, to obtain an updated linear velocity and angular velocity, then replacing the linear velocity and angular velocity of the mobile acquisition device at the current frame with the updated values, adding one to the iteration count, and continuing to step S204;
S210: taking the current rotation matrix and translation matrix as the rotation matrix and translation matrix of the current frame, recording the rotation matrices and translation matrices of all frames, and superimposing the original point cloud frames on the reference point cloud to obtain the point cloud map in the world coordinate system;
wherein the step S300 of extracting the road network map from the high-precision map comprises the following steps:
S301: extracting the road surface by using a region-growing algorithm, with the normal vector and curvature as the connectivity criteria for region growing;
S302: filtering the extracted point cloud of the road surface with a preset light intensity threshold, and keeping the points whose light intensity is greater than the preset light intensity threshold to obtain the point cloud of the lane lines;
S303: performing Euclidean distance clustering on the obtained point cloud of the lane lines to obtain the clustered point cloud of each lane line;
S304: for each cluster, using straight-line fitting to obtain the characteristic parameters (a, b, c, x1, y1, z1, x2, y2, z2) of each lane line, wherein a, b and c are the straight-line fitting result of the lane line, (x1, y1, z1) are the coordinates of one end point of the lane line, and (x2, y2, z2) are the coordinates of the other end point of the lane line.
2. The road network map generation method according to claim 1, wherein said step S203 of calculating a mean and a covariance matrix of a multidimensional normal distribution of each grid comprises the following steps:
obtaining the coordinate values Pi(xi, yi, zi) of each point in the current grid, i ∈ (1, n), where n is the number of points in the grid;
the mean value Pmean (x) of the multi-dimensional normal distribution of the current grid is calculated according to the following formulam,ym):
Figure FDA0003073036890000021
The covariance matrix of the current grid is calculated according to the following formula:
cov(x,x) = (1/n) * Σ_{i=1..n} (xi - xm)^2
cov(x,y) = (1/n) * Σ_{i=1..n} (xi - xm)(yi - ym)
cov(y,x) = (1/n) * Σ_{i=1..n} (yi - ym)(xi - xm)
cov(y,y) = (1/n) * Σ_{i=1..n} (yi - ym)^2
C = [cov(x,x), cov(x,y)
     cov(y,x), cov(y,y)]
wherein, the calculated C is the covariance matrix of the current grid.
3. The road network map generating method according to claim 1, wherein in step S204, a translation matrix is obtained by calculation according to the linear velocity of the mobile collection device in the current frame, and a rotation matrix is obtained according to the angular velocity of the mobile collection device in the current frame, comprising the steps of:
and calculating a rotation matrix R and a translation matrix T of the current frame according to the following formula:
T=[V(xv)*t,V(yv)*t,V(zv)*t];
R=[W(xw)*t,0,0
0,W(yw)*t,0
0,0,W(zw)*t]
where t is the interval between two frames, V(xv, yv, zv) is the linear velocity of the current frame, and W(xw, yw, zw) is the angular velocity of the current frame.
4. The road network map generation method according to claim 2, wherein said step S206 of calculating the probability density of each transition point according to the normal distribution parameter comprises the steps of:
the probability density for each transition point is calculated according to the following formula:
f(x, y) = 1 / (2π * σ1 * σ2 * sqrt(1 - ρ^2)) * exp( -1 / (2(1 - ρ^2)) * [ (x - μ1)^2 / σ1^2 - 2ρ(x - μ1)(y - μ2) / (σ1 * σ2) + (y - μ2)^2 / σ2^2 ] )
ρ = (sqrt(C(0,1) + C(1,0)))^2
wherein the coordinates of the conversion point are (x, y), μ1 and μ2 are respectively the means Pmean(xm) and Pmean(ym) of the multi-dimensional normal distribution of the current grid, and σ1 and σ2 are the values of C(0,0) and C(1,1), respectively.
5. The road network map generation method according to claim 1, wherein in step S208, determining whether the NDT registration score meets a preset convergence condition includes the following steps:
if the current iteration count has reached a preset iteration count N, judging whether the maximum fluctuation of the NDT registration score over the last N iterations is greater than a preset fluctuation threshold; if so, judging that the NDT registration score has not reached the preset convergence condition, otherwise judging that the NDT registration score has reached the preset convergence condition;
and if the current iteration count has not reached the preset iteration count N, judging that the NDT registration score has not reached the preset convergence condition.
6. The road network map generation method according to claim 1, wherein in step S209, the NDT registration score is optimized according to a newton optimization algorithm to obtain updated linear velocity and angular velocity, and the method comprises the following steps:
calculating the sum of probability densities of all the conversion points
L = Σ_{i=1..n} fi(xi, yi)
where fi(xi, yi) is the probability density of the i-th conversion point and n is the number of conversion points in the current grid; the sum L is taken as the iterative objective function;
the angular velocity of the current frame is updated according to the following formula:
W(xw)k+1=W(xw)k–L'(W(xw)k)/L”(W(xw)k)
W(yw)k+1=W(yw)k–L'(W(yw)k)/L”(W(yw)k)
W(zw)k+1=W(zw)k–L'(W(zw)k)/L”(W(zw)k)
where k is the current iteration count, W(xw)k, W(yw)k and W(zw)k are the angular velocity components before the update, W(xw)k+1, W(yw)k+1 and W(zw)k+1 are the updated angular velocity components, and L' and L'' are respectively the first and second derivatives of the objective function;
updating the linear velocity of the current frame according to the following formula:
V(xv)k+1=V(xv)k–L'(V(xv)k)/L”(V(xv)k)
V(yv)k+1=V(yv)k–L'(V(yv)k)/L”(V(yv)k)
V(zv)k+1=V(zv)k–L'(V(zv)k)/L”(V(zv)k)
where V(xv)k, V(yv)k and V(zv)k are the linear velocity components before the update, and V(xv)k+1, V(yv)k+1 and V(zv)k+1 are the updated linear velocity components.
7. The road network map generation method according to claim 1, further comprising, between steps S200 and S300, the steps of:
s211: and acquiring an image acquired by the camera, and adding RGB values to the point cloud map under the world coordinate system according to the pose transformation relation between the camera and the laser radar.
8. The road network map generation method according to claim 7, wherein said step S211 comprises the steps of:
acquiring an image acquired by a camera, and projecting a point obtained by laser radar measurement back to a two-dimensional coordinate system of the camera according to the pose transformation relation of the laser radar and the camera;
finding the RGB value of a pixel point closest to the projection point in the image collected by the camera, and taking the RGB value as the RGB value of the point obtained by current laser radar measurement;
and if the projection point is out of the image collected by the camera, setting the RGB value of the point measured by the current laser radar as a default value.
9. A road network map generation system for implementing the road network map generation method according to any one of claims 1 to 8, said system comprising:
the data acquisition module is used for acquiring the data collected by the laser radar, obtaining multi-frame point cloud data, and acquiring the data collected by the inertial measurement unit corresponding to each frame;
the high-precision map generation module is used for generating a high-precision map according to the point cloud data and the acquired data of the inertial measurement unit;
and the road network map extraction module is used for extracting the road network map from the high-precision map.
10. A road network map generation device, characterized by comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the road network map generation method of any one of claims 1 to 8 via execution of the executable instructions.
11. A computer-readable storage medium storing a program, wherein the program is configured to implement the steps of the road network map generation method according to any one of claims 1 to 8 when executed.
CN201810708643.XA 2018-07-02 2018-07-02 Road network map generation method, system, equipment and storage medium Active CN108871353B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810708643.XA CN108871353B (en) 2018-07-02 2018-07-02 Road network map generation method, system, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN108871353A (en)  2018-11-23
CN108871353B (en)  2021-10-15

Family

ID=64298053

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810708643.XA Active CN108871353B (en) 2018-07-02 2018-07-02 Road network map generation method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN108871353B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111261016B (en) * 2018-11-30 2022-04-26 北京嘀嘀无限科技发展有限公司 Road map construction method and device and electronic equipment
CN109430984A (en) * 2018-12-12 2019-03-08 云南电网有限责任公司电力科学研究院 A kind of image conversion intelligent and safe helmet suitable for electric power place
CN111380529B (en) * 2018-12-28 2023-10-03 浙江菜鸟供应链管理有限公司 Mobile device positioning method, device and system and mobile device
CN109859137B (en) * 2019-02-14 2023-02-17 重庆邮电大学 Wide-angle camera irregular distortion global correction method
CN109814572B (en) * 2019-02-20 2022-02-01 广州市山丘智能科技有限公司 Mobile robot positioning and mapping method and device, mobile robot and storage medium
CN109917376B (en) * 2019-02-26 2021-08-06 东软睿驰汽车技术(沈阳)有限公司 Positioning method and device
CN116972880A (en) * 2019-03-18 2023-10-31 深圳市速腾聚创科技有限公司 Precision detection device of positioning algorithm
CN109740604B (en) * 2019-04-01 2019-07-05 深兰人工智能芯片研究院(江苏)有限公司 A kind of method and apparatus of running region detection
CN111829531A (en) * 2019-04-15 2020-10-27 北京京东尚科信息技术有限公司 Two-dimensional map construction method and device, robot positioning system and storage medium
CN110163930B (en) * 2019-05-27 2023-06-27 北京百度网讯科技有限公司 Lane line generation method, device, equipment, system and readable storage medium
CN110186468B (en) * 2019-05-27 2021-11-12 北京主线科技有限公司 High-precision map making method and device for automatic driving
WO2021007716A1 (en) * 2019-07-12 2021-01-21 Beijing Voyager Technology Co., Ltd. Systems and methods for positioning
WO2021012243A1 (en) * 2019-07-25 2021-01-28 Beijing Voyager Technology Co., Ltd. Positioning systems and methods
CN110633342B (en) * 2019-07-29 2022-06-17 武汉光庭信息技术股份有限公司 Lane topology network generation method
CN110728693B (en) * 2019-09-27 2022-12-23 上海维智卓新信息科技有限公司 Method and system for generating three-dimensional point cloud of large-scale driving road surface
CN111192341A (en) * 2019-12-31 2020-05-22 北京三快在线科技有限公司 Method and device for generating high-precision map, automatic driving equipment and storage medium
CN111540015A (en) * 2020-04-27 2020-08-14 深圳南方德尔汽车电子有限公司 Point cloud matching-based pose calculation method and device, computer equipment and storage medium
CN111539999A (en) * 2020-04-27 2020-08-14 深圳南方德尔汽车电子有限公司 Point cloud registration-based 3D map construction method and device, computer equipment and storage medium
CN114783214A (en) * 2020-04-28 2022-07-22 上海波若智能科技有限公司 Road network dynamic data acquisition method and road network dynamic data acquisition system
CN112936292B (en) * 2021-03-29 2022-05-24 昆明理工大学 Open-source slicing path planning robot arc additive manufacturing method
CN113639761B (en) * 2021-08-25 2023-11-10 吉林大学 Two-dimensional translation and rotation displacement and speed synchronous non-contact measurement method using black-and-white grid pattern code
CN114994673B (en) * 2022-08-04 2022-10-21 南京隼眼电子科技有限公司 Road map generation method and device for radar and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102538801A (en) * 2010-12-16 2012-07-04 上海博泰悦臻电子设备制造有限公司 Method and device for processing road network data in navigation map
CN102749083A (en) * 2012-07-17 2012-10-24 中国科学院遥感应用研究所 Method and device for constructing map realizing integration of multi-road network data
CN104679949A (en) * 2015-02-06 2015-06-03 中山大学 Method for creating Paramics road network based on XML (Extensive Markup Language) road network data

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8610708B2 (en) * 2010-09-22 2013-12-17 Raytheon Company Method and apparatus for three-dimensional image reconstruction


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant
CP01: Change in the name or title of a patent holder
Address after: Room 503-3, 398 Jiangsu Road, Changning District, Shanghai 200050
Patentee after: Shanghai Xijing Technology Co.,Ltd.
Address before: Room 503-3, 398 Jiangsu Road, Changning District, Shanghai 200050
Patentee before: SHANGHAI WESTWELL INFORMATION AND TECHNOLOGY Co.,Ltd.