CN114705199A - Lane-level fusion positioning method and system - Google Patents

Lane-level fusion positioning method and system

Info

Publication number
CN114705199A
CN114705199A
Authority
CN
China
Prior art keywords
positioning
lane
vehicle
point
coarse
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210356880.0A
Other languages
Chinese (zh)
Inventor
万满
任凡
王宽
杨钊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Changan Automobile Co Ltd
Original Assignee
Chongqing Changan Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Changan Automobile Co Ltd filed Critical Chongqing Changan Automobile Co Ltd
Priority to CN202210356880.0A priority Critical patent/CN114705199A/en
Publication of CN114705199A publication Critical patent/CN114705199A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Navigation (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The invention relates to a lane-level fusion positioning method and system. Positioning is initialized by GPS, after which coarse dead reckoning is carried out on the basis of the initialization. The coarse positioning result is used to obtain a local high-precision map of the surrounding area, and the number, type, color and other attributes of the lane lines in the high-precision map are matched with the corresponding lane-line information recognized by a forward-looking camera to obtain the lane occupied by the vehicle on the high-precision map. Finally, lane-level fine registration is performed between the vehicle's historical trajectory points over a period of time and the center line of the occupied lane on the high-precision map, yielding a lane-level positioning result with higher precision.

Description

Lane level fusion positioning method and system
Technical Field
The invention belongs to the technical field of automatic driving fusion positioning, and particularly relates to a lane-level fusion positioning method and system.
Background
Positioning is a key technology for realizing automatic driving. Combining positioning with a high-precision map gives the autonomous vehicle a perception capability beyond its on-board sensors and greatly improves the safety and efficiency of automatic driving. Within automatic-driving perception fusion, positioning fusion is a necessary step for realizing automatic vehicle control. Current positioning schemes for autonomous vehicles mainly include combined inertial navigation (GNSS and INS) positioning, laser radar positioning, visual positioning, and combined positioning that integrates different devices. Traditional single-sensor or simple combined positioning schemes tend to have limited accuracy and poor robustness. As an example of fusion positioning, the patent with publication number CN106767853A (application number 201611261781.5), "A method for high-precision positioning of unmanned vehicles based on multi-information fusion", proposes a fusion positioning scheme based on vision, GNSS/INS, laser radar and high-precision maps; it offers good stability and high positioning precision, but the expensive laser radar prevents automatic driving based on this scheme from being mass-produced.
The existing positioning methods mainly have the following defects:
1) positioning schemes that rely on a single sensor, or on a simple combination of sensors, have limited positioning accuracy and poor robustness;
2) positioning schemes that fuse multiple sensors such as a high-precision map and laser radar achieve good positioning performance, but their cost is high and they are difficult to apply to mass-produced vehicle models in the short term.
Disclosure of Invention
In view of the above deficiencies of the prior art, the technical problem to be solved by the present invention is to provide a lane-level fusion positioning method and system that avoid the high cost otherwise required to achieve the positioning performance needed for vehicle control.
In order to solve the technical problems, the invention adopts the following technical scheme:
a lane-level fusion positioning method specifically comprises the following steps:
S1: initial positioning: when the lane-level fusion positioning system starts to operate, the positioning initialization module performs three frames of initial positioning to obtain the initial positioning result and the local high-precision map of the first frame, and judges whether the initial positioning result is valid; if it is valid, the initialization is successful and S2 is executed; if it is invalid, initialization positioning and judgment are repeated until initialization succeeds;
S2: coarse dead reckoning: based on the initial positioning result, the track coarse reckoning module calculates a coarse positioning position of the vehicle from the vehicle speed information, and obtains a local high-precision map around the vehicle for each frame from the coarse positioning position;
S3: coordinate conversion: the coordinate conversion module converts global GPS coordinates into Cartesian coordinates in the vehicle's own coordinate system;
S4: lane matching: the lane where the vehicle is located is determined using the local high-precision map of the current frame, and the lanes of the matched road section on the local high-precision map are numbered against the visually detected lane lines to obtain the lane number of the lane occupied by the vehicle;
S5: lane-level fine registration positioning: the coarse positioning result obtained by the coarse dead reckoning of the current frame in step S2 is corrected using the lane number obtained in step S4, the corrected positioning result is output, and step S6 is executed;
S6: coordinate inverse conversion: the corrected positioning result obtained in step S5 is converted into global GPS coordinates by the coordinate inverse conversion module and output; at the same time, these GPS coordinates are fed back into the track coarse reckoning module to correct the coarse positioning result of step S2.
In a further refinement of the above technical solution, in step S1 the validity of the initial positioning result is judged from the difference between the positioning results of three consecutive frames: if the positioning results of three consecutive frames differ only slightly, the initial positioning result is considered valid; if the difference is too large, it is considered invalid.
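By way of illustration, a minimal sketch of this validity check is given below. It assumes the "difference" is the Euclidean distance between consecutive GPS fixes and uses a hypothetical 3 m threshold; neither the metric nor the threshold is specified by the patent.

    import math

    def gps_to_metres(lat, lon, ref_lat, ref_lon):
        """Approximate conversion of a latitude/longitude difference to metres
        (flat-earth approximation, adequate over a few tens of metres)."""
        r_earth = 6371000.0
        dx = math.radians(lon - ref_lon) * r_earth * math.cos(math.radians(ref_lat))
        dy = math.radians(lat - ref_lat) * r_earth
        return dx, dy

    def initialization_valid(fixes, max_jump_m=3.0):
        """fixes: three consecutive (lat, lon) GPS positioning results.
        Accept the initialization only if every pair of consecutive fixes
        differs by less than max_jump_m metres (threshold is an assumption)."""
        for (lat0, lon0), (lat1, lon1) in zip(fixes, fixes[1:]):
            dx, dy = gps_to_metres(lat1, lon1, lat0, lon0)
            if math.hypot(dx, dy) > max_jump_m:
                return False
        return True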
Further, step S2 is based on the assumption that the pose of the host vehicle does not change greatly between two consecutive frames in the track coarse reckoning module.
The vehicle speed v, the heading angular velocity ω and the turning radius r of the vehicle are then calculated by the following formulas:
v = (v_r + v_l) / 2
ω = (v_r − v_l) / l
r = v / ω
where v_r is the speed of the right rear wheel, in m/s; v_l is the speed of the left rear wheel, in m/s; and l is the rear wheel track.
The pose of the vehicle at the current moment t is then calculated by:
x_t = x_{t-1} + Δt * v * cos(θ_t)
y_t = y_{t-1} + Δt * v * sin(θ_t)
θ_t = θ_{t-1} + Δt * ω
where x and y are the position and θ is the heading angle.
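By way of illustration, the following minimal Python sketch performs one coarse dead-reckoning step under the rear-wheel differential-drive reading of the above formulas; the function and variable names are illustrative and not taken from the patent.

    import math
    from dataclasses import dataclass

    @dataclass
    class Pose:
        x: float      # m, position in the local Cartesian frame
        y: float      # m
        theta: float  # rad, heading angle

    def dead_reckon(prev: Pose, v_l: float, v_r: float, track_l: float, dt: float) -> Pose:
        """One coarse dead-reckoning step from the rear wheel speeds v_l, v_r (m/s),
        the rear wheel track track_l (m) and the time step dt (s)."""
        v = 0.5 * (v_l + v_r)            # vehicle speed
        omega = (v_r - v_l) / track_l    # heading angular velocity
        theta = prev.theta + dt * omega  # integrate the heading first, as in the formulas above
        return Pose(
            x=prev.x + dt * v * math.cos(theta),
            y=prev.y + dt * v * math.sin(theta),
            theta=theta,
        )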
Further, the global GPS coordinates in step S3 come from the corresponding local high-precision map obtained from the coarse positioning result of each frame of the vehicle in step S2: the lane-line points and lane center-line points in the local high-precision map are given as global GPS coordinates.
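The patent does not state which projection the coordinate conversion and coordinate inverse conversion modules use. The sketch below assumes a simple flat-earth approximation about a reference point, which is adequate over the extent of a local high-precision map; a full conversion into the vehicle's own frame would additionally rotate the result by the vehicle heading.

    import math

    R_EARTH = 6378137.0  # WGS-84 equatorial radius, m

    def gps_to_local(lat, lon, ref_lat, ref_lon):
        """Global GPS coordinates (degrees) -> local Cartesian coordinates (m)
        about the reference point (ref_lat, ref_lon)."""
        x = math.radians(lon - ref_lon) * R_EARTH * math.cos(math.radians(ref_lat))
        y = math.radians(lat - ref_lat) * R_EARTH
        return x, y

    def local_to_gps(x, y, ref_lat, ref_lon):
        """Inverse conversion, as performed by the coordinate inverse conversion module."""
        lat = ref_lat + math.degrees(y / R_EARTH)
        lon = ref_lon + math.degrees(x / (R_EARTH * math.cos(math.radians(ref_lat))))
        return lat, lon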
Further, step S4 comprises:
S4.1: according to the local high-precision map of the current frame obtained in step S1 or S2, determining the road section where the vehicle is located on that map by comparing the number of lane lines;
S4.2: comparing the lateral distance between the two lane lines bounding the vehicle, and the types and colors of these two lane lines, on the local high-precision map of the current frame with the corresponding quantities given by the forward-looking camera, thereby determining the lane occupied by the vehicle on the local high-precision map of the current frame and completing lane matching;
S4.3: numbering the lanes obtained in step S4.2 sequentially from left to right, and outputting the lane number N of the lane where the vehicle is located.
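A minimal sketch of the matching in steps S4.2 and S4.3 follows; it assumes the lanes of the matched road section are already numbered from left to right, and the data classes, scoring scheme and width tolerance are illustrative assumptions rather than details given by the patent.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class LaneLine:
        color: str      # e.g. "white", "yellow"
        kind: str       # e.g. "dashed", "solid"

    @dataclass
    class MapLane:
        number: int     # lane number N, counted from the left
        left: LaneLine
        right: LaneLine
        width_m: float  # lateral distance between its two lane lines

    def match_lane(map_lanes: List[MapLane],
                   cam_left: LaneLine, cam_right: LaneLine,
                   cam_width_m: float, width_tol_m: float = 0.5) -> Optional[int]:
        """Return the number N of the map lane whose bounding lane lines best agree
        with the camera observation in type, color and lateral distance."""
        best_number, best_score = None, -1
        for lane in map_lanes:
            score = 0
            score += lane.left.kind == cam_left.kind
            score += lane.left.color == cam_left.color
            score += lane.right.kind == cam_right.kind
            score += lane.right.color == cam_right.color
            score += abs(lane.width_m - cam_width_m) < width_tol_m
            if score > best_score:
                best_number, best_score = lane.number, score
        return best_number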
Further, in step S5:
the trace point of the center line of the occupied lane on the local high-precision map of the current frame that is closest to the vehicle is obtained through the lane number N and added to a calculation queue Q1, and the coarse positioning point obtained by the coarse dead reckoning of step S2 is added to a calculation queue Q2;
the capacities of the calculation queues Q1 and Q2 are fixed; whenever Q1 and Q2 are full, the trace point and the coarse positioning point that entered the queues first are deleted, and the points of the current frame are appended to the tails of the respective queues;
a correction quantity for the coarse dead reckoning is calculated from the trace points in queue Q1 and the coarse positioning points in queue Q2, and the coarse positioning result of the coarse dead reckoning of the current frame is corrected by this correction quantity to obtain the corrected positioning result.
Further, the capacities of the calculation queues Q1 and Q2 are both 10;
that is, the trace points of the occupied-lane center line closest to the vehicle on the 10 most recent local high-precision maps are stored and recorded as point set A, and the coarse vehicle positioning points of the 10 most recent frames are recorded as point set B.
The correction quantity for the coarse dead-reckoned positioning points is calculated according to the following formulas:
θ = arctan( Σ_{i=1..10} [ (x_i^B − x̄^B)(y_i^A − ȳ^A) − (y_i^B − ȳ^B)(x_i^A − x̄^A) ] / Σ_{i=1..10} [ (x_i^B − x̄^B)(x_i^A − x̄^A) + (y_i^B − ȳ^B)(y_i^A − ȳ^A) ] )
t = (x̄^A, ȳ^A) − R · (x̄^B, ȳ^B)
where θ is the rotation amount, t is the translation amount, x_i^A and y_i^A are the x and y coordinates of the i-th frame trace point of point set A, x_i^B and y_i^B are the x and y coordinates of the i-th frame coarse positioning point of point set B, x̄^A and ȳ^A are the x and y coordinates of the center point of point set A, x̄^B and ȳ^B are the x and y coordinates of the center point of point set B, and R is the rotation matrix corresponding to θ.
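A minimal sketch of this lane-level fine registration follows, reading the formulas above as the standard closed-form least-squares 2-D rigid alignment of point set B (coarse dead-reckoned points) onto point set A (lane center-line points); the fixed-capacity queues are realized with deque(maxlen=10). Names and structure are illustrative.

    import math
    from collections import deque

    Q1 = deque(maxlen=10)  # point set A: closest lane center-line points of the last 10 frames
    Q2 = deque(maxlen=10)  # point set B: coarse dead-reckoned points of the last 10 frames

    def registration_correction(A, B):
        """Return (theta, tx, ty) such that a coarse point p is corrected as
        p_corrected = R(theta) @ p + t, aligning point set B onto point set A."""
        n = len(A)
        ax = sum(p[0] for p in A) / n
        ay = sum(p[1] for p in A) / n
        bx = sum(p[0] for p in B) / n
        by = sum(p[1] for p in B) / n
        num = den = 0.0
        for (xa, ya), (xb, yb) in zip(A, B):
            num += (xb - bx) * (ya - ay) - (yb - by) * (xa - ax)
            den += (xb - bx) * (xa - ax) + (yb - by) * (ya - ay)
        theta = math.atan2(num, den)
        c, s = math.cos(theta), math.sin(theta)
        tx = ax - (c * bx - s * by)
        ty = ay - (s * bx + c * by)
        return theta, tx, ty

    def apply_correction(coarse_xy, theta, tx, ty):
        """Rotate and translate the current coarse dead-reckoned position."""
        c, s = math.cos(theta), math.sin(theta)
        x, y = coarse_xy
        return c * x - s * y + tx, s * x + c * y + ty

In use, once both queues hold 10 frames the correction would be computed with A = list(Q1) and B = list(Q2) and then applied to the coarse positioning result of the current frame.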
Further, in step S2, if the global GPS positioning point fed back from the coordinate inverse conversion module is valid, the coarse positioning result in the track coarse reckoning module may be corrected by this global GPS positioning point.
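The patent only states that the coarse result may be corrected by a valid fed-back GPS point. The sketch below assumes the simplest policy, replacing (or partially blending toward) the fed-back point; the blend factor is an illustrative parameter, not part of the patent.

    def correct_with_gps(coarse_xy, gps_local_xy, gps_valid, blend=1.0):
        """If the fed-back GPS point is valid, pull the coarse position toward it.
        blend=1.0 replaces the coarse position entirely; smaller values blend."""
        if not gps_valid:
            return coarse_xy
        x = (1.0 - blend) * coarse_xy[0] + blend * gps_local_xy[0]
        y = (1.0 - blend) * coarse_xy[1] + blend * gps_local_xy[1]
        return x, y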
The invention also relates to a lane-level fusion positioning system, comprising:
the positioning initialization module, used to perform positioning initialization after the automatic driving function is started and to obtain the initial result of the whole positioning system and the local high-precision map at the start of operation;
the track coarse reckoning module, used to calculate a coarse positioning position of the vehicle from the vehicle speed on the basis of the initial result, and to obtain a local high-precision map around the vehicle at each moment from the coarse positioning position;
the coordinate conversion module, used to convert global GPS coordinates into local Cartesian coordinates in the vehicle's own coordinate system;
the lane matching module, used to determine the lane of the vehicle on the local high-precision map at each moment;
the lane-level fine registration positioning module, used to correct the coarse positioning result obtained by the coarse dead reckoning of the current frame to obtain the final positioning result;
and the coordinate inverse conversion module, used to convert the local Cartesian coordinates in the vehicle's own coordinate system back into global GPS positioning coordinates, performing the inverse operation of the coordinate conversion module.
Compared with the prior art, the invention has the following beneficial effects:
1. The lane-level fusion positioning method provided by the invention assumes that an autonomous vehicle generally drives along the center line of its lane, which is usually the case. Positioning can therefore be initialized by GPS, coarse dead reckoning is then performed on the basis of the initialization, and a local high-precision map of the surrounding area is obtained from the coarse positioning result. The number, type, color and other attributes of the lane lines in the high-precision map are matched with the corresponding lane-line information recognized by the forward-looking camera to obtain the lane occupied by the vehicle on the high-precision map. Finally, lane-level fine registration is performed between the vehicle's historical trajectory points over a period of time and the center line of the occupied lane on the high-precision map, yielding a lane-level positioning result with higher precision.
2. The lane-level fusion positioning system provided by the invention uses low-cost sensors and achieves lane-level high-precision positioning through modules for positioning initialization, track coarse reckoning, conversion from GPS to local Cartesian coordinates, lane matching, lane-level fine registration, and inverse conversion from local Cartesian coordinates back to global GPS coordinates; it can therefore be used in mass-produced vehicles.
Drawings
Fig. 1 is a flowchart of a lane-level fusion positioning method according to an embodiment.
Detailed Description
The following describes embodiments of the present invention in further detail with reference to the accompanying drawings.
The lane-level fusion positioning system of the specific embodiment comprises:
the positioning initialization module, used to perform positioning initialization after the automatic driving function is started and to obtain the initial result of the whole positioning system and the local high-precision map at the start of operation;
the track coarse reckoning module, used to calculate a coarse positioning position of the vehicle from the vehicle speed on the basis of the initial result, and to obtain a local high-precision map around the vehicle at each moment from the coarse positioning position;
the coordinate conversion module, used to convert global GPS coordinates into local Cartesian coordinates in the vehicle's own coordinate system;
the lane matching module, used to determine the lane of the vehicle on the local high-precision map at each moment;
the lane-level fine registration positioning module, used to correct the coarse positioning result obtained by the coarse dead reckoning of the current frame to obtain the final positioning result;
and the coordinate inverse conversion module, used to convert the local Cartesian coordinates in the vehicle's own coordinate system back into global GPS positioning coordinates, performing the inverse operation of the coordinate conversion module.
The lane-level fusion positioning system of this embodiment uses low-cost sensors and achieves lane-level high-precision positioning through modules for positioning initialization, track coarse reckoning, conversion from GPS to local Cartesian coordinates, lane matching, lane-level fine registration, and inverse conversion from local Cartesian coordinates back to global GPS coordinates; it can be used in mass-produced vehicles.
Referring to fig. 1, the present invention further provides a lane-level fusion positioning method, which is performed based on the above lane-level fusion positioning system, and specifically includes the following steps:
S1: initial positioning: when the lane-level fusion positioning system starts to operate, the positioning initialization module performs three frames of initial positioning to obtain the initial positioning result and the local high-precision map of the first frame, and judges whether the initial positioning result is valid; if it is valid, the initialization is successful and S2 is executed; if it is invalid, initialization positioning and judgment are repeated until initialization succeeds;
S2: coarse dead reckoning: based on the initial positioning result, the track coarse reckoning module calculates a coarse positioning position of the vehicle from the vehicle speed information, and obtains a local high-precision map around the vehicle for each frame from the coarse positioning position;
S3: coordinate conversion: the coordinate conversion module converts global GPS coordinates into Cartesian coordinates in the vehicle's own coordinate system;
S4: lane matching: the lane where the vehicle is located is determined using the local high-precision map of the current frame, and the lanes of the matched road section on the local high-precision map are numbered against the visually detected lane lines to obtain the lane number of the lane occupied by the vehicle;
S4.1: according to the local high-precision map of the current frame obtained in step S1 or S2, the road section where the vehicle is located on that map is determined by comparing the number of lane lines;
S4.2: the lateral distance between the two lane lines bounding the vehicle, and the types and colors of these two lane lines, on the local high-precision map of the current frame are compared with the corresponding quantities given by the forward-looking camera, thereby determining the lane occupied by the vehicle on the local high-precision map of the current frame and completing lane matching;
S4.3: the lanes obtained in step S4.2 are numbered sequentially from left to right, and the lane number N of the lane where the vehicle is located is output;
S5: lane-level fine registration positioning: the coarse positioning result obtained by the coarse dead reckoning of the current frame in step S2 is corrected using the lane number obtained in step S4, the corrected positioning result is output, and step S6 is executed;
S6: coordinate inverse conversion: the corrected positioning result obtained in step S5 is converted into global GPS coordinates by the coordinate inverse conversion module and output; at the same time, these GPS coordinates are fed back into the track coarse reckoning module to correct the coarse positioning result of step S2.
In implementation, the invention assumes that the autonomous vehicle generally drives along the center line of its lane, which is usually the case. Positioning can therefore be initialized by GPS, coarse dead reckoning is then performed on the basis of the initialization, a local high-precision map of the surrounding area is obtained from the coarse positioning result, and the number, type, color and other attributes of its lane lines are matched with the corresponding lane-line information recognized by the forward-looking camera to obtain the lane occupied by the vehicle on the high-precision map. Finally, lane-level fine registration is performed between the vehicle's historical trajectory points over a period of time and the center line of the occupied lane on the high-precision map, yielding a lane-level positioning result with higher precision.
It should be understood here that the local high-precision map of the current frame mentioned in step S4.1 refers to the local high-precision map mentioned in step S1 or S2: for the first frame after the lane-level fusion positioning system starts to operate, the required map is the local high-precision map of step S1; once the system is running, it is the local high-precision map of each frame from step S2.
Referring to fig. 1, in step S1 the validity of the initial positioning result is judged from the difference between the positioning results of three consecutive frames: if the positioning results of three consecutive frames differ only slightly, the initial positioning result is considered valid; if the difference is too large, it is considered invalid.
In this way, subsequent positioning algorithms can be run.
Step S2 is based on the assumption that the pose of the host vehicle does not change greatly between two consecutive frames in the track coarse reckoning module.
The vehicle speed v, the heading angular velocity ω and the turning radius r of the vehicle are then calculated by the following formulas:
v = (v_r + v_l) / 2
ω = (v_r − v_l) / l
r = v / ω
where v_r is the speed of the right rear wheel, in m/s; v_l is the speed of the left rear wheel, in m/s; and l is the rear wheel track.
The pose of the vehicle at the current moment t is then calculated by:
x_t = x_{t-1} + Δt * v * cos(θ_t)
y_t = y_{t-1} + Δt * v * sin(θ_t)
θ_t = θ_{t-1} + Δt * ω
where x and y are the position and θ is the heading angle.
In this way the coarse pose of the host vehicle is reckoned, which is then used for the subsequent lane matching.
The global GPS coordinates in step S3 come from the corresponding local high-precision map obtained from the coarse positioning result of each frame of the vehicle in step S2: the lane-line points and lane center-line points in the local high-precision map are given as global GPS coordinates.
In practice, the local high-precision map obtained here covers a range of 150 m in front of the vehicle and 50 m behind the vehicle.
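A minimal sketch of selecting such a map window follows. It assumes the map points are already expressed in the local Cartesian frame and that the 150 m / 50 m range is measured along the vehicle heading; the patent does not specify how the window is cut.

    import math

    def local_map_window(map_points, ego_x, ego_y, ego_theta,
                         ahead_m=150.0, behind_m=50.0):
        """Keep only the (x, y) map points whose longitudinal offset in the
        vehicle frame lies between -behind_m and +ahead_m."""
        c, s = math.cos(ego_theta), math.sin(ego_theta)
        window = []
        for x, y in map_points:
            dx, dy = x - ego_x, y - ego_y
            longitudinal = c * dx + s * dy  # projection onto the heading direction
            if -behind_m <= longitudinal <= ahead_m:
                window.append((x, y))
        return window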
In step S5,
the trace point of the center line of the occupied lane on the local high-precision map of the current frame that is closest to the vehicle is obtained through the lane number N and added to a calculation queue Q1, and the coarse positioning point obtained by the coarse dead reckoning of step S2 is added to a calculation queue Q2;
the capacities of the calculation queues Q1 and Q2 are fixed; whenever Q1 and Q2 are full, the trace point and the coarse positioning point that entered the queues first are deleted, and the points of the current frame are appended to the tails of the respective queues;
a correction quantity for the coarse dead reckoning is calculated from the trace points in queue Q1 and the coarse positioning points in queue Q2, and the coarse positioning result of the coarse dead reckoning of the current frame is corrected by this correction quantity to obtain the corrected positioning result.
In implementation, the capacities of the calculation queues Q1 and Q2 are both fixed at 10: the trace points of the occupied-lane center line closest to the vehicle on the 10 most recent local high-precision maps are stored and recorded as point set A, and the coarse vehicle positioning points of the 10 most recent frames are recorded as point set B. The correction quantity for the coarse dead-reckoned positioning points is calculated according to the following formulas:
θ = arctan( Σ_{i=1..10} [ (x_i^B − x̄^B)(y_i^A − ȳ^A) − (y_i^B − ȳ^B)(x_i^A − x̄^A) ] / Σ_{i=1..10} [ (x_i^B − x̄^B)(x_i^A − x̄^A) + (y_i^B − ȳ^B)(y_i^A − ȳ^A) ] )
t = (x̄^A, ȳ^A) − R · (x̄^B, ȳ^B)
where θ is the rotation amount, t is the translation amount, x_i^A and y_i^A are the x and y coordinates of the i-th frame trace point of point set A, x_i^B and y_i^B are the x and y coordinates of the i-th frame coarse positioning point of point set B, x̄^A and ȳ^A are the x and y coordinates of the center point of point set A (i.e. the means of the x and y coordinates of all points of point set A), x̄^B and ȳ^B are the x and y coordinates of the center point of point set B (i.e. the means of the x and y coordinates of all points of point set B), and R is the rotation matrix corresponding to θ.
The coarse positioning result of the coarse dead reckoning of the current frame is rotated and translated by this correction quantity to obtain the corrected positioning result, which is output to the coordinate inverse conversion module for inverse coordinate conversion.
In step S2, if the global GPS positioning point fed back from the coordinate inverse conversion module is valid, the coarse positioning result in the track coarse reckoning module may be corrected by this global GPS positioning point.
Finally, the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit them. Although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the present invention without departing from their spirit and scope, and all such modifications and substitutions are intended to be covered by the claims of the present invention.

Claims (9)

1. A lane-level fusion positioning method, characterized in that it specifically comprises the following steps:
S1: initial positioning: when the lane-level fusion positioning system starts to operate, the positioning initialization module performs three frames of initial positioning to obtain the initial positioning result and the local high-precision map of the first frame, and judges whether the initial positioning result is valid; if it is valid, the initialization is successful and S2 is executed; if it is invalid, initialization positioning and judgment are repeated until initialization succeeds;
S2: coarse dead reckoning: based on the initial positioning result, the track coarse reckoning module calculates a coarse positioning position of the vehicle from the vehicle speed information, and obtains a local high-precision map around the vehicle for each frame from the coarse positioning position;
S3: coordinate conversion: the coordinate conversion module converts global GPS coordinates into Cartesian coordinates in the vehicle's own coordinate system;
S4: lane matching: the lane where the vehicle is located is determined using the local high-precision map of the current frame, and the lanes of the matched road section on the local high-precision map are numbered against the visually detected lane lines to obtain the lane number of the lane occupied by the vehicle;
S5: lane-level fine registration positioning: the coarse positioning result obtained by the coarse dead reckoning of the current frame in step S2 is corrected using the lane number obtained in step S4, the corrected positioning result is output, and step S6 is executed;
S6: coordinate inverse conversion: the corrected positioning result obtained in step S5 is converted into global GPS coordinates by the coordinate inverse conversion module and output; at the same time, these GPS coordinates are fed back into the track coarse reckoning module to correct the coarse positioning result of step S2.
2. The lane-level fusion positioning method according to claim 1, characterized in that: in step S1, the validity of the initial positioning result is judged from the difference between the positioning results of three consecutive frames; if the positioning results of three consecutive frames differ only slightly, the initial positioning result is considered valid; if the difference is too large, it is considered invalid.
3. The lane-level fusion positioning method according to claim 1, characterized in that: step S2 is based on the assumption that the pose of the host vehicle does not change greatly between two consecutive frames in the track coarse reckoning module;
the vehicle speed v, the heading angular velocity ω and the turning radius r of the vehicle are then calculated by the following formulas:
v = (v_r + v_l) / 2
ω = (v_r − v_l) / l
r = v / ω
where v_r is the speed of the right rear wheel, in m/s; v_l is the speed of the left rear wheel, in m/s; and l is the rear wheel track;
the pose of the vehicle at the current moment t is then calculated by:
x_t = x_{t-1} + Δt * v * cos(θ_t)
y_t = y_{t-1} + Δt * v * sin(θ_t)
θ_t = θ_{t-1} + Δt * ω
where x and y are the position and θ is the heading angle.
4. The lane-level fusion positioning method according to claim 1, characterized in that: the global GPS coordinates in step S3 come from the corresponding local high-precision map obtained from the coarse positioning result of each frame of the vehicle in step S2, the lane-line points and lane center-line points in the local high-precision map being given as global GPS coordinates.
5. The lane-level fusion positioning method according to claim 1, characterized in that: step S4 further comprises:
S4.1: according to the local high-precision map of the current frame obtained in step S1 or S2, determining the road section where the vehicle is located on that map by comparing the number of lane lines;
S4.2: comparing the lateral distance between the two lane lines bounding the vehicle, and the types and colors of these two lane lines, on the local high-precision map of the current frame with the corresponding quantities given by the forward-looking camera, thereby determining the lane occupied by the vehicle on the local high-precision map of the current frame and completing lane matching;
S4.3: numbering the lanes obtained in step S4.2 sequentially from left to right, and outputting the lane number N of the lane where the vehicle is located.
6. The lane-level fusion positioning method according to claim 5, characterized in that: in step S5,
the trace point of the center line of the occupied lane on the local high-precision map of the current frame that is closest to the vehicle is obtained through the lane number N and added to a calculation queue Q1, and the coarse positioning point obtained by the coarse dead reckoning of step S2 is added to a calculation queue Q2;
the capacities of the calculation queues Q1 and Q2 are fixed; whenever Q1 and Q2 are full, the trace point and the coarse positioning point that entered the queues first are deleted, and the points of the current frame are appended to the tails of the respective queues;
a correction quantity for the coarse dead reckoning is calculated from the trace points in queue Q1 and the coarse positioning points in queue Q2, and the coarse positioning result of the coarse dead reckoning of the current frame is corrected by this correction quantity to obtain the corrected positioning result.
7. The lane-level fusion positioning method according to claim 6, characterized in that: the capacities of the calculation queues Q1 and Q2 are both 10;
that is, the trace points of the occupied-lane center line closest to the vehicle on the 10 most recent local high-precision maps are stored and recorded as point set A, and the coarse vehicle positioning points of the 10 most recent frames are recorded as point set B;
the correction quantity for the coarse dead-reckoned positioning points is calculated according to the following formulas:
θ = arctan( Σ_{i=1..10} [ (x_i^B − x̄^B)(y_i^A − ȳ^A) − (y_i^B − ȳ^B)(x_i^A − x̄^A) ] / Σ_{i=1..10} [ (x_i^B − x̄^B)(x_i^A − x̄^A) + (y_i^B − ȳ^B)(y_i^A − ȳ^A) ] )
t = (x̄^A, ȳ^A) − R · (x̄^B, ȳ^B)
where θ is the rotation amount, t is the translation amount, x_i^A and y_i^A are the x and y coordinates of the i-th frame trace point of point set A, x_i^B and y_i^B are the x and y coordinates of the i-th frame coarse positioning point of point set B, x̄^A and ȳ^A are the x and y coordinates of the center point of point set A, x̄^B and ȳ^B are the x and y coordinates of the center point of point set B, and R is the rotation matrix corresponding to θ.
8. The lane-level fusion positioning method according to claim 6, characterized in that: in step S2, if the global GPS positioning point fed back from the coordinate inverse conversion module is valid, the coarse positioning result in the track coarse reckoning module may be corrected by this global GPS positioning point.
9. A lane-level fusion positioning system, characterized in that it comprises:
the positioning initialization module, used to perform positioning initialization after the automatic driving function is started and to obtain the initial result of the whole positioning system and the local high-precision map at the start of operation;
the track coarse reckoning module, used to calculate a coarse positioning position of the vehicle from the vehicle speed on the basis of the initial result, and to obtain a local high-precision map around the vehicle at each moment from the coarse positioning position;
the coordinate conversion module, used to convert global GPS coordinates into local Cartesian coordinates in the vehicle's own coordinate system;
the lane matching module, used to determine the lane of the vehicle on the local high-precision map at each moment;
the lane-level fine registration positioning module, used to correct the coarse positioning result obtained by the coarse dead reckoning of the current frame to obtain the final positioning result;
and the coordinate inverse conversion module, used to convert the local Cartesian coordinates in the vehicle's own coordinate system back into global GPS positioning coordinates, performing the inverse operation of the coordinate conversion module.
CN202210356880.0A 2022-03-31 2022-03-31 Lane-level fusion positioning method and system Pending CN114705199A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210356880.0A CN114705199A (en) 2022-03-31 2022-03-31 Lane-level fusion positioning method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210356880.0A CN114705199A (en) 2022-03-31 2022-03-31 Lane-level fusion positioning method and system

Publications (1)

Publication Number Publication Date
CN114705199A (en) 2022-07-05

Family

ID=82173359

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210356880.0A Pending CN114705199A (en) 2022-03-31 2022-03-31 Lane-level fusion positioning method and system

Country Status (1)

Country Link
CN (1) CN114705199A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115061479A (en) * 2022-08-03 2022-09-16 国汽智控(北京)科技有限公司 Lane relation determination method and device, electronic equipment and storage medium
CN115061479B (en) * 2022-08-03 2022-11-04 国汽智控(北京)科技有限公司 Lane relation determination method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109946732B (en) Unmanned vehicle positioning method based on multi-sensor data fusion
US11307040B2 (en) Map information provision system
US11024055B2 (en) Vehicle, vehicle positioning system, and vehicle positioning method
US11802769B2 (en) Lane line positioning method and apparatus, and storage medium thereof
CN109945858B (en) Multi-sensing fusion positioning method for low-speed parking driving scene
CN107246868B (en) Collaborative navigation positioning system and navigation positioning method
CN102208035B (en) Image processing system and position measuring system
CN108628324B (en) Unmanned vehicle navigation method, device, equipment and storage medium based on vector map
Lee et al. Development of a self-driving car that can handle the adverse weather
WO2022147924A1 (en) Method and apparatus for vehicle positioning, storage medium, and electronic device
CN114396957B (en) Positioning pose calibration method based on vision and map lane line matching and automobile
JP4596566B2 (en) Self-vehicle information recognition device and self-vehicle information recognition method
CN113920198B (en) Coarse-to-fine multi-sensor fusion positioning method based on semantic edge alignment
CN112819711B (en) Monocular vision-based vehicle reverse positioning method utilizing road lane line
CN113252022A (en) Map data processing method and device
CN111413990A (en) Lane change track planning system
JP7418196B2 (en) Travel trajectory estimation method and travel trajectory estimation device
Hara et al. Vehicle localization based on the detection of line segments from multi-camera images
CN113566817B (en) Vehicle positioning method and device
CN113405555B (en) Automatic driving positioning sensing method, system and device
CN114705199A (en) Lane-level fusion positioning method and system
JP6790951B2 (en) Map information learning method and map information learning device
CN112710301B (en) High-precision positioning method and system for automatic driving vehicle
CN115790613A (en) Visual information assisted inertial/odometer integrated navigation method and device
CN112530270B (en) Mapping method and device based on region allocation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination