CN113822124A - Lane level positioning method, device, equipment and storage medium - Google Patents

Lane level positioning method, device, equipment and storage medium

Info

Publication number
CN113822124A
Authority
CN
China
Prior art keywords
lane
lane line
line
vehicle
determining
Prior art date
Legal status
Pending
Application number
CN202110693316.3A
Other languages
Chinese (zh)
Inventor
肖宁 (Xiao Ning)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202110693316.3A
Publication of CN113822124A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/22: Matching criteria, e.g. proximity measures
    • G06F18/29: Graphical models, e.g. Bayesian networks
    • G06F18/295: Markov models or related models, e.g. semi-Markov models; Markov random fields; networks embedding Markov models

Abstract

The embodiments of this application relate to a lane-level positioning method, apparatus, device, and storage medium, applicable to fields such as maps, navigation, automatic driving, internet of vehicles, intelligent transportation, and cloud computing. The method comprises the following steps: acquiring first lane data of a first road segment where a vehicle is located at a first time and second lane data of a second road segment where the vehicle was located at a second time, the second time being the time immediately preceding the first time; acquiring a lane image in front of the vehicle at the first time, and determining lane line information of each lane line in the lane image; and determining the actual lane in which the vehicle is located at the first time based on the first lane data, the second lane data, and the lane line information of each lane line in the lane image. By adopting the embodiments of this application, the positioning efficiency and the applicability of lane-level positioning can be improved.

Description

Lane level positioning method, device, equipment and storage medium
Technical Field
The present application relates to the field of transportation, and in particular, to a lane-level positioning method, apparatus, device, and storage medium.
Background
In the related art, lane-level positioning can be implemented with the Real-Time Kinematic (RTK) carrier-phase differential technique based on high-precision map data. However, this approach depends heavily on the measuring equipment and is insufficiently stable, and its positioning efficiency is low because acquiring and processing high-precision map data consumes a large amount of time.
In the related art, lane-level positioning is also achieved by tracking the vehicle through sensor-based position recognition, lidar ranging, and 3D point-cloud feature scanning. However, such methods are relatively costly and have relatively low applicability.
Therefore, it is necessary to improve the positioning efficiency and applicability of the lane-level positioning.
Disclosure of Invention
The embodiments of this application provide a lane-level positioning method, apparatus, device, and storage medium, which can improve the positioning efficiency and applicability of lane-level positioning.
In one aspect, an embodiment of the present application provides a lane-level positioning method, where the method includes:
acquiring first lane data of a first road segment where a vehicle is located at a first time and second lane data of a second road segment where the vehicle was located at a second time, the second time being the time immediately preceding the first time;
acquiring a lane image in front of the vehicle at the first time, and determining lane line information of each lane line in the lane image;
and determining the actual lane in which the vehicle is located at the first time based on the first lane data, the second lane data, and the lane line information of each lane line in the lane image.
In another aspect, an embodiment of the present application provides a lane-level positioning apparatus, including:
a lane data acquisition module, configured to acquire first lane data of a first road segment where a vehicle is located at a first time and second lane data of a second road segment where the vehicle was located at a second time, the second time being the time immediately preceding the first time;
a lane image acquisition module, configured to acquire a lane image in front of the vehicle at the first time and determine lane line information of each lane line in the lane image;
and a lane positioning module, configured to determine the actual lane in which the vehicle is located at the first time based on the first lane data, the second lane data, and the lane line information of each lane line in the lane image.
In another aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the processor and the memory are connected to each other;
the memory is used for storing computer programs;
the processor is configured to execute the lane-level positioning method provided by the embodiment of the application when the computer program is called.
In another aspect, the present application provides a computer-readable storage medium, which stores a computer program, where the computer program is executed by a processor to implement the lane-level positioning method provided by the present application.
In another aspect, embodiments of the present application provide a computer program product or a computer program, which includes computer instructions stored in a computer-readable storage medium. The processor of the electronic device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the lane-level positioning method provided by the embodiment of the application.
In the embodiments of this application, the lane image in front of the vehicle at the first time is used to determine the lane line information of each lane line ahead of the vehicle on the road segment where it is located at the first time; the actual lane of the vehicle at the first time is then determined based on this lane line information together with the lane data of the road segments at the two times, which improves the positioning efficiency of lane-level positioning. Moreover, since the lane data of the road segment where the vehicle is located and the lane image in front of the vehicle can be obtained in real time, the actual lane of the vehicle can be determined in real time at every moment, giving the method high applicability.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic view of a scene of a lane-level positioning method provided in an embodiment of the present application;
FIG. 2 is a schematic flow chart of a lane-level positioning method according to an embodiment of the present disclosure;
FIG. 3 is another schematic flow chart of a lane-level positioning method provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of a scenario for determining a lane line combination according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a vehicle coordinate system according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a method for determining a lane line distance according to an embodiment of the present disclosure;
FIG. 7a is a schematic diagram illustrating a scenario of determining a transition probability matrix according to an embodiment of the present application;
FIG. 7b is a schematic diagram of another scenario for determining a transition probability matrix according to an embodiment of the present application;
FIG. 8 is a flowchart of a lane-level positioning method provided in an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a lane-level positioning device provided in an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings. The described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort fall within the protection scope of the present application.
The lane-level positioning method provided by the embodiments of this application is applicable to fields such as maps, navigation, automatic driving, intelligent vehicle control, internet of vehicles, intelligent transportation, and cloud computing, for example to Intelligent Transportation Systems (ITS) and Intelligent Vehicle-Infrastructure Cooperative Systems (IVICS) in the transportation field.
An Intelligent Transportation System is a comprehensive transportation system that applies advanced science and technology (information technology, computer technology, data communication technology, sensor technology, electronic control technology, automatic control theory, operations research, artificial intelligence, and so on) to transportation, service control, and vehicle manufacturing, strengthening the links among vehicles, roads, and users, thereby ensuring safety, improving efficiency, improving the environment, and saving energy. Based on the lane-level positioning method provided by the embodiments of this application, the lane in which a vehicle is travelling on a road can be determined, providing strong support for transportation, service control, and related applications.
The intelligent vehicle-infrastructure cooperative system is a development direction of Intelligent Transportation Systems (ITS). It uses technologies such as advanced wireless communication and the new generation of the internet to implement dynamic, real-time vehicle-to-vehicle and vehicle-to-infrastructure information exchange in all directions, and carries out active vehicle safety control and cooperative road management on the basis of full-time dynamic traffic information acquisition and fusion, fully realizing effective cooperation among people, vehicles, and roads, ensuring traffic safety, and improving traffic efficiency, thereby forming a safe, efficient, and environment-friendly road traffic system. The lane-level positioning method provided by the embodiments of this application can provide technical support for traffic safety and vehicle-infrastructure cooperation based on lane-level positioning of vehicles.
Referring to fig. 1, fig. 1 is a scene schematic diagram of a lane-level positioning method provided in an embodiment of the present application. As shown in fig. 1, during the driving of the vehicle 100, first lane data of a first road segment 200 where the vehicle 100 is located at a first time and second lane data of a second road segment 300 where the vehicle 100 was located at a second time, the second time being the time immediately preceding the first time, may be acquired. That is, while the vehicle 100 is driving, the first lane data of the road segment where the vehicle 100 is located at any time, and the second lane data of the road segment where it was located at the immediately preceding time, can be acquired.
Further, in the process of acquiring the first lane data and the second lane data of the vehicle 100, the lane image 400 in front of the vehicle 100 at the first time may also be acquired. The lane image 400 in front of the vehicle 100 is a road image including lane information in front of the vehicle 100 on the first road segment 200 at the first time.
Further, lane line information of each lane line in the lane image 400 may be determined, and an actual lane where the vehicle 100 is located in the first road section 200 at the first time may be determined based on the first lane data, the second lane data, and the lane line information.
The lane-level positioning method provided by the embodiment of the present application may be implemented by an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, a cloud server providing cloud computing services, and the like, or by a terminal such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart watch, a vehicle-mounted terminal, and a smart television, which is not limited herein.
Referring to fig. 2, fig. 2 is a schematic flow chart of the lane-level positioning method according to the embodiment of the present application. As shown in fig. 2, the lane-level positioning method provided in the embodiment of the present application may include the following steps:
Step S21, acquiring first lane data of a first road segment where the vehicle is located at a first time and second lane data of a second road segment where the vehicle was located at a second time.
In some possible embodiments, while the vehicle travels on the road, the lane data of the road segment where it is located at each time can be obtained in real time; the first time is any time during the vehicle's travel, and the second time is the time immediately preceding the first time. The time span between the first time and the second time may be determined based on the requirements of the actual application scenario, which is not limited here.
The lane data of any road segment may be obtained from the road data corresponding to that segment, and includes, but is not limited to, the number of lanes, lane directions, lane topology (lane distribution), lane line colors, and lane line types (solid line, dashed line, etc.).
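As an illustration only, such per-segment lane data might be organized as follows in C++; the type and field names are assumptions made for this sketch, not structures defined by this application:

#include <string>
#include <vector>

// Illustrative layout for per-segment lane data (names are assumptions).
struct LaneLine {
    std::string type;   // e.g. "single_solid", "single_dashed", "guard_rail"
    std::string color;  // e.g. "white", "yellow"
};

struct LaneData {
    int laneCount;                    // number of lanes in the road segment
    std::vector<LaneLine> laneLines;  // laneCount + 1 lane lines, left to right
    std::vector<int> throughLane;     // per lane: index of the same lane in the next segment, -1 if none
};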
The road data may be existing map data, Advanced Driving Assistance System (ADAS) data, internet of vehicles data, and the like, which is not limited here. ADAS is an active safety technology that uses various sensors mounted on the vehicle to collect environmental data inside and outside the vehicle in real time, and performs technical processing such as identification, detection, and tracking of static and dynamic objects, so that the driver can perceive possible dangers as early as possible, raising attention and improving safety.
Specifically, taking a first time (current time) as an example, when acquiring the lane data of the road segment where the vehicle is located at the first time, the positioning information of the vehicle at the first time may be determined, and then the first road segment where the vehicle is located at the first time may be determined based on the positioning information of the vehicle at the first time, and then the first lane data of the first road segment may be acquired.
The positioning information of the vehicle at the first time may be determined based on satellite positioning information, or based on vehicle control information, vehicle visual perception information, Inertial Measurement Unit (IMU) information, and the like, and based on the information, the positioning information of the vehicle at the first time is output through a certain algorithm.
After the first road segment where the vehicle is located at the first time is determined, the first lane data corresponding to the first road segment can be acquired from the road data corresponding to the first road segment based on the positioning information of the vehicle at the first time.
The positioning information includes, but is not limited to, longitude and latitude information and the specific driving distance of the vehicle along each road, such as being 300 meters from road junction A; this is not limited here.
Further, the road segment the vehicle travelled immediately before the first road segment may be determined as the second road segment where the vehicle was located at the second time, and the second lane data corresponding to the second road segment is then obtained from the road data corresponding to the second road segment.
Optionally, the positioning information of the vehicle at the second time may also be determined based on the positioning information of the vehicle at the first time and the driving information of the vehicle during driving, such as vehicle control information, vehicle driving speed, and the like, and then the second road section where the vehicle is located at the second time may be determined based on the positioning information of the vehicle at the second time, and then the second lane data of the second road section may be obtained.
In the embodiments of this application, the road data and the lane data corresponding to each road segment may be obtained from a database, a database management system, or a blockchain, and the like, determined based on the requirements of the actual application scenario, which is not limited here. The blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a series of data blocks linked using cryptography, with each data block storing the road data of a road segment.
And step S22, acquiring a lane image in front of the vehicle at the first moment, and determining lane line information of each lane line in the lane image.
In some possible embodiments, the lane image in front of the vehicle at the first time contains relevant information about all or some of the lanes in the first road segment, and may be captured by an image acquisition device of the vehicle, for example a camera mounted behind the front windshield, on the roof, or at the front of the vehicle, or the vehicle's event data recorder. The specific acquisition manner may be determined based on the requirements of the actual application scenario and is not limited here.
Further, the lane image in front of the vehicle at the first time is processed by techniques such as image processing, visual recognition, and machine learning, and the lane lines in the lane image and the lane line information of each lane line are obtained. The lane line information of each lane line in the lane image includes, but is not limited to, lane line color, lane line type, lane line distribution information, and lane line number, and is not limited herein.
Specifically, after the lane lines in the lane image are identified, for each lane line, the confidence that its lane line type is each known lane line type can be determined, and the lane line type with the highest confidence is taken as the type of that lane line. Similarly, for each lane line, the confidence that its color is each known lane line color can be determined, and the color with the highest confidence is taken as the color of that lane line.
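A minimal sketch of this highest-confidence selection, assuming a recognizer that outputs a confidence score per candidate label; the function name and types are illustrative, not part of this application:

#include <map>
#include <string>

// Pick the candidate lane line type (or color) with the highest confidence.
std::string pickHighestConfidence(const std::map<std::string, double>& confidences) {
    std::string best;
    double bestScore = -1.0;
    for (const auto& [label, score] : confidences) {
        if (score > bestScore) { bestScore = score; best = label; }
    }
    return best;
}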
The lane line type is the way a lane line is actually rendered on the road, including but not limited to a single solid line, a single dashed line, a double solid line, a double dashed line, and mixed lines (dashed on the left with solid on the right, or solid on the left with dashed on the right). Meanwhile, in the embodiments of this application, guard rails, curbs, road edges, and the like may also be identified as lane lines in the lane image through techniques such as image processing and visual recognition.
The lane line color is the actual color of a lane line on the road, including but not limited to yellow, white, blue, green, gray, and black. For lane lines that are not actual painted markings, such as guard rails, curbs, and road edges, the lane line color may be set to a preset color different from the colors of the actual lane lines; this is not limited here.
And step S23, determining the actual lane where the vehicle is located at the first moment based on the first lane data, the second lane data and the lane line information of each lane line in the lane image.
In some possible embodiments, determining the actual lane in which the vehicle is located at the first moment based on the first lane data, the second lane data, and the lane line information may be implemented based on a hidden markov model, see in particular fig. 3. Fig. 3 is another schematic flow chart of the lane-level positioning method according to the embodiment of the present application. As shown in fig. 3, determining the actual lane in which the vehicle is located at the first time specifically includes the following steps:
step S31, determining a first probability that each lane of the first road segment is within the field of view of the vehicle when on the second road segment, based on the first road data and the lane line information of each lane line in the lane image.
In some possible embodiments, based on the first lane data and the lane line information, the emission probability matrix used by the hidden Markov model to determine the actual lane of the vehicle at the first time can be determined. Any element in the emission probability matrix represents a first probability that one lane of the first road segment is within the field of view of the vehicle when the vehicle is on the second road segment; the matrix as a whole represents these first probabilities for every lane of the first road segment while the vehicle travels the second road segment at the second time.
When determining the first probability corresponding to each lane of the first road segment, the lane line information of each lane line of the first road segment may first be determined based on the first lane data; the first probability that each lane of the first road segment is within the field of view of the vehicle on the second road segment is then determined based on the number of lane lines in the lane image, the lane line information of each lane line in the lane image, and the lane line information of each lane line of the first road segment.
The lane lines of the first road segment can likewise be determined from the first lane data. The lane line information of each lane line of the first road segment includes the lane line type and the lane line color of each lane line, that is, the actual lane line type and the actual lane line color of each lane line of the first road segment.
In some possible embodiments, when determining the first probability that each lane of the first road segment is within the field of view of the vehicle on the second road segment based on the number of lane lines in the lane image, the lane line information of each lane line in the lane image, and the lane line information of each lane line of the first road segment, the lane line combination corresponding to each lane of the first road segment may be determined first.
Each lane line combination contains the lane lines corresponding to its lane, and the number of lane lines in the combination equals the number of lane lines in the lane image. That is, for each lane in the first road segment, a lane line combination with the same number of lane lines as the lane image can be formed from the two lane lines bounding the lane and other lane lines of the first road segment.
Specifically, for each lane in the first road segment, the combination corresponding to that lane may be formed from the two lane lines bounding the lane (a first lane line and a second lane line) together with the lane line(s) adjacent to the first lane line and/or the second lane line, such that the number of lane lines in the combination equals the number of lane lines in the lane image.
Referring to fig. 4, fig. 4 is a schematic view of a scene for determining a lane line combination according to an embodiment of the present application. Fig. 4 shows five lanes in the first road segment, formed by the six lane lines L1, L2, L3, R1, R2, and R3, while the number of lane lines in the lane image is 4. Based on this, for the first lane in the first road segment, a lane line combination corresponding to the first lane may be formed from the lane lines bounding the first lane (lane lines L1 and R1) together with the lane line L2 adjacent to lane line L1 and the lane line R2 adjacent to lane line R1, as shown in the sketch below.
For the second lane in the first road segment, a lane line combination corresponding to the second lane may be formed from the lane lines bounding the second lane (lane lines R1 and R2) together with the lane line L1 adjacent to lane line R1 and the lane line R3 adjacent to lane line R2.
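One plausible reading of this construction is a window of consecutive segment lane lines that contains the lane's two bounding lines and is clamped at the segment boundaries; the sketch below (all names are assumptions) reproduces the Fig. 4 examples:

#include <algorithm>
#include <vector>

// Build the lane line combination for the lane bounded by segment lane lines
// leftIdx and leftIdx + 1: a window of k consecutive lane line indices
// (k = number of lane lines detected in the lane image, k <= totalLines),
// centred on the lane as far as the segment boundaries allow.
std::vector<int> laneLineCombination(int leftIdx, int totalLines, int k) {
    int start = leftIdx - (k - 2) / 2;                     // expand roughly evenly
    start = std::max(0, std::min(start, totalLines - k));  // clamp to the segment
    std::vector<int> combo(k);
    for (int i = 0; i < k; ++i) combo[i] = start + i;
    return combo;
}

With the six lane lines of Fig. 4 indexed 0..5 (L3, L2, L1, R1, R2, R3) and k = 4, the first lane (leftIdx = 2) yields indices {1, 2, 3, 4}, i.e. L2, L1, R1, R2, and the second lane (leftIdx = 3) yields {2, 3, 4, 5}, i.e. L1, R1, R2, R3, consistent with the description above.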
Further, for each lane line combination, the degree to which the combination matches the lane lines in the lane image may be determined based on the lane line information of each lane line in the combination and of each lane line in the lane image; the first probability that the lane corresponding to the combination is within the field of view of the vehicle on the second road segment is then determined based on that matching degree.
Specifically, for each lane line combination, when determining the overall matching degree between the combination and the lane lines in the lane image, a lane line combination image may be constructed from the combination and the lane line information of its lane lines; the image similarity between this combination image and the lane image is then computed and taken as the overall matching degree. As an example, the matching degree corresponding to a lane line combination may be determined as the first probability for the lane corresponding to that combination.
Optionally, for each lane line combination, when determining the overall matching degree between the combination and the lane lines in the lane image, if the lane line information of each lane line of the first road segment includes the lane line type, the lane lines in the combination whose type is the same as that of the corresponding lane line in the lane image may be determined, and the overall matching degree determined based on the lane lines with matching types. As an example, the overall type matching degree between the combination and the lane lines in the lane image may be taken as the first probability corresponding to the combination.
If the lane line type of every lane line in the combination is the same as that of the corresponding lane line in the lane image, the overall type matching degree of the combination can be set to 1, and the overall matching degree determined from it. As an example, the overall type matching degree, or the product of the overall type matching degree and an overall type weight, may be taken as the overall matching degree between the combination and the lane lines in the lane image.
If only some lane lines in the combination have the same type as the corresponding lane lines in the lane image, the overall type matching degree is determined based on the correspondence between the number of lane lines with matching types and the overall type matching degree. As an example, if half of the lane lines in the combination match the types of the corresponding lane lines in the lane image, the overall type matching degree is set to 0.5, and then 0.5, or the product of 0.5 and the overall type weight, may be taken as the overall matching degree between the combination and the lane lines in the lane image.
Optionally, for each lane line combination, when determining the overall matching degree between the combination and the lane lines in the lane image, if the lane line information of each lane line of the first road segment includes the lane line color, the lane lines in the combination whose color is the same as that of the corresponding lane line in the lane image may be determined, and the overall color matching degree determined based on the lane lines with matching colors. As an example, the overall color matching degree between the combination and the lane lines in the lane image may be taken as the first probability corresponding to the combination.
If the lane line color of every lane line in the combination is the same as that of the corresponding lane line in the lane image, the overall color matching degree can be set to 1, and the overall matching degree determined from it. As an example, the overall color matching degree, or the product of the overall color matching degree and an overall color weight, may be taken as the overall matching degree between the combination and the lane lines in the lane image.
If only some lane lines in the combination have the same color as the corresponding lane lines in the lane image, the overall color matching degree is determined based on the correspondence between the number of lane lines with matching colors and the overall color matching degree. As an example, if half of the lane lines in the combination match the colors of the corresponding lane lines in the lane image, the overall color matching degree is set to 0.5, and then 0.5, or the product of 0.5 and the overall color weight, may be taken as the overall matching degree between the combination and the lane lines in the lane image.
Optionally, for each lane line combination, if the lane line information of each lane line of the first road segment includes both the lane line color and the lane line type, the overall color matching degree and the overall type matching degree between the combination and the lane lines in the lane image may be determined as described above, and the overall matching degree of the combination determined from the two. As an example, the product of the overall color matching degree and the overall type matching degree may be taken as the overall matching degree of the combination, which is in turn taken as the first probability of the corresponding lane.
As an example, the matching degree between the lane line combination corresponding to lane s and the lane lines in the lane image may be computed as follows:
calcMatchProb(line_type_obs, line_type_real, line_color_obs, line_color_real);
where calcMatchProb() is the matching-degree function, line_type_obs denotes the lane line type of a lane line in the lane image, line_type_real denotes the lane line type of a lane line in the lane line combination, line_color_obs denotes the lane line color of a lane line in the lane image, and line_color_real denotes the lane line color of a lane line in the lane line combination.
The specific computation inside this function may follow the implementations described above for determining the overall matching degree from the overall color matching degree and the overall type matching degree, and is not limited here.
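A minimal sketch of calcMatchProb() under the assumption that the per-line matching degree is the product of a 0/1 type match and a 0/1 color match; the application leaves the exact computation open, so this scoring is illustrative only:

#include <string>

// Matching degree of one lane line in the combination against the
// corresponding lane line observed in the lane image (0/1 scoring assumed).
double calcMatchProb(const std::string& line_type_obs,  const std::string& line_type_real,
                     const std::string& line_color_obs, const std::string& line_color_real) {
    double typeMatch  = (line_type_obs  == line_type_real)  ? 1.0 : 0.0;
    double colorMatch = (line_color_obs == line_color_real) ? 1.0 : 0.0;
    return typeMatch * colorMatch;
}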
In some possible embodiments, for each lane line combination, the matching degree between the combination and the lane lines in the lane image may instead consist of the matching degree between each lane line in the combination and its corresponding lane line in the lane image; lane line weights for the lane lines in the lane image are then determined based on the lane line information of each lane line in the lane image. On this basis, the first probability that the lane corresponding to the combination is within the field of view of the vehicle on the second road segment can be determined from the per-line matching degrees and the per-line lane line weights.
As an example, let the emission matrix corresponding to the lanes of the first road segment be emissionProb, a matrix with 1 row and N columns, where N is the number of lanes in the first road segment and each element is the first probability of one lane. When the number of lane lines in the lane image is 4, the weight matrix of lane line weights is a matrix with 1 row and 4 columns, where each element is the weight of one lane line in the lane image when computing the emission probability, and the sum of the weights is 1. Based on this, for lane s in the first road segment, the first probability corresponding to the lane is:
emissionProb[s] = H_L1 * leftQ1 + H_L2 * leftQ2 + H_R1 * rightQ1 + H_R2 * rightQ2
where leftQ1, leftQ2, rightQ1, and rightQ2 respectively denote the matching degrees between the lane lines L1, L2, R1, and R2 corresponding to lane s and the corresponding lane lines in the lane image, lane lines L1 and R1 being the two lane lines bounding lane s, and H_L1, H_L2, H_R1, and H_R2 denote the lane line weights in the weight matrix probTable corresponding to lane lines L1, L2, R1, and R2, respectively.
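A sketch of this weighted-sum emission computation for a single lane; the container layout is an assumption (per-line matching degrees and lane line weights both ordered left to right, weights summing to 1):

#include <cstddef>
#include <vector>

// emissionProb[s] as a weighted sum of the per-line matching degrees of the
// lane's combination, e.g. match = {leftQ2, leftQ1, rightQ1, rightQ2} and
// weight = the selected row of the weight matrix probTable.
double emissionProbForLane(const std::vector<double>& match,
                           const std::vector<double>& weight) {
    double p = 0.0;
    for (std::size_t k = 0; k < match.size(); ++k) p += weight[k] * match[k];
    return p;
}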
Specifically, in a case where the lane line information of each lane line in the first road segment includes a lane line type and a lane line color, determining, for each lane line combination, a matching degree of each lane line in the lane line combination with a corresponding lane line in the lane image based on the lane line information of each lane line in the lane line combination and the lane line information of each lane line in the lane image includes:
determining the type matching degree of the lane line type of each lane line in the lane line combination and the lane line type of the corresponding lane line in the lane image;
determining the color matching degree of the lane line color of each lane line in the lane line combination and the lane line color of the corresponding lane line in the lane image;
and determining the matching degree of each lane line in the lane line combination and the corresponding lane line in the lane image based on the type matching degree and the color matching degree corresponding to each lane line in the lane line combination.
As an example, for each lane line in each lane line combination, the product of the type matching degree and the color matching degree corresponding to the lane line may be determined as the matching degree of the lane line with the corresponding lane line in the lane image.
For the lane lines in the first road segment, the probability of appearing within the field of view of the vehicle on the second road segment differs with the position of the lane line in the first road segment. Therefore, the relative distribution positions of the lane lines in the lane image with respect to the first road segment can be determined based on whether the leftmost and rightmost lane lines in the lane image are road edge lines, the first distance between the two leftmost lane lines, and the second distance between the two rightmost lane lines, so as to determine the lane line weight of each lane line from these relative positions.
Specifically, the first distance between the two leftmost lane lines in the lane image and the second distance between the two rightmost lane lines are determined, and the road edge lines in the lane image are determined based on the lane line information of each lane line. The relative distribution positions of the lane lines in the lane image can then be determined from these two distances and the road edge lines.
When determining whether the lane image has the road edge line, the determination may be performed based on the lane line type and/or the lane line color of each lane line in the lane image, for example, when the lane image includes a lane line of a preset lane line type, the lane line may be determined as the road edge line, and/or when the lane image includes a lane line of a preset lane line color, the lane line may be determined as the road edge line. Alternatively, it may be recognized whether a guard rail, a curb, or the like exists in the lane image based on a technique such as image recognition, and if so, it may be determined that a road edge line exists in the lane image.
Further, if the leftmost and rightmost lane lines in the lane image are both road edge lines, or neither of them is a road edge line, each preset weight in the first weight combination is determined as the lane line weight of the corresponding lane line in the lane image;
if the first distance is smaller than a first threshold value, the leftmost lane line is a road edge line, and the rightmost lane line is a non-road edge line, determining each preset weight in the second weight combination as a lane line weight corresponding to each lane line in the lane image;
if the first distance is greater than or equal to a first threshold value, and the leftmost lane line is a road edge line and the rightmost lane line is a non-road edge line, determining each preset weight in the third weight combination as a lane line weight corresponding to each lane line in the lane image;
if the second distance is smaller than a second threshold value, and the rightmost lane line is a road edge line and the leftmost lane line is a non-road edge line, determining each preset weight in the fourth weight combination as a lane line weight corresponding to each lane line in the lane image;
and if the second distance is greater than or equal to a second threshold value, and the rightmost lane line is a road edge line and the leftmost lane line is a non-road edge line, determining each preset weight in the fifth weight combination as a lane line weight corresponding to each lane line in the lane image.
The number of the preset weights in any weight combination is consistent with the number of the lane lines in the lane image, and one preset weight in any weight combination corresponds to one lane line in the lane image. The sum of the preset weights in any weight combination is 1.
As an example, in the case where the number of lane lines in the lane image is 4 (L2, L1, R1, R2), the weight combinations may be represented by a matrix probTable[5][4] with 5 rows and 4 columns, where each row represents one weight combination and the columns from left to right represent the lane line weights of lane lines L2, L1, R1, and R2:
probTable[5][4]={
{0.24,0.26,0.26,0.24},// both the leftmost lane line and the rightmost lane line are road edge lines, or neither the leftmost nor the rightmost lane line is a road edge line;
{0.50,0.20,0.20,0.10},// the first distance is smaller than a first threshold, and the leftmost lane line is a road edge line and the rightmost lane line is a non-road edge line;
{0.27,0.28,0.24,0.21},// the first distance is greater than or equal to a first threshold, and the leftmost lane line is a road edge line and the rightmost lane line is a non-road edge line;
{0.10,0.20,0.20,0.50},// second distance is smaller than a second threshold, and the rightmost lane line is a road edge line and the leftmost lane line is a non-road edge line;
{0.21,0.24,0.28,0.27},// second distance is greater than or equal to a second threshold, and the rightmost lane line is a road edge line and the leftmost lane line is a non-road edge line;
};
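A sketch of how the row of probTable[5][4] might be selected from the five cases above; the thresholds th1 and th2 and the edge-line flags are assumed to come from the preceding detection steps:

// Select the weight combination (row index into probTable) from whether the
// outermost detected lane lines are road edge lines and from the first/second
// distances between the two leftmost / two rightmost lane lines.
int selectWeightRow(bool leftIsEdge, bool rightIsEdge,
                    double firstDist, double secondDist,
                    double th1, double th2) {
    if (leftIsEdge == rightIsEdge) return 0;         // both or neither are edge lines
    if (leftIsEdge) return (firstDist < th1) ? 1 : 2;
    return (secondDist < th2) ? 3 : 4;               // rightmost line is the edge line
}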
in some possible embodiments, the first distance between the leftmost two lane lines and the second distance between the rightmost two lane lines in the lane image may be determined by constructing a coordinate system. Taking the determination of the first distance between the two leftmost lane lines in the lane image as an example, a Vehicle Coordinate System (VCS) may be constructed first based on the positioning information of the vehicle at the first time. Wherein the vehicle coordinate system is used for describing a special three-dimensional moving coordinate system O-XYZ of the vehicle motion. The vehicle coordinate system can be constructed based on a left-hand system, a right-hand system and the like, and the origin of the coordinate system can be a vehicle head midpoint, a front axle midpoint or a rear axle midpoint and the like, which is not limited herein.
As an example, referring to fig. 5, fig. 5 is a schematic diagram of a vehicle coordinate system according to an embodiment of the present application. As shown in fig. 5, the origin O of the vehicle coordinate system is fixed relative to the vehicle and taken at the vehicle's center of mass; when the vehicle is at rest on a horizontal road surface, the X axis points toward the front of the vehicle parallel to the ground, the Y axis points to the left side of the vehicle, and the Z axis points upward through the center of mass.
Further, the two leftmost lane lines may be subjected to inverse perspective transformation to convert them from image coordinates to world coordinates, and the transformed lane lines are then fitted and reconstructed to obtain the lane line equations of the two leftmost lane lines in the vehicle coordinate system (hereinafter referred to as the first coordinate system for convenience of description). These lane line equations may be 2nd-order polynomials, 3rd-order polynomials, or other forms, which is not limited here. As an example, the lane line equation of a lane line in the first coordinate system may be y = d + a*x + b*x^2 + c*x^3 or y = d + a*x + b*x^2, where a, b, c, and d are constants that can be determined from the actual lane line; this is not limited here.
Further, a first distance between the leftmost two lane lines may be determined based on a lane line equation in which the leftmost two lane lines correspond to the first coordinate system. For example, the intercept of each lane line equation on a certain coordinate axis can be respectively determined, and then the first distance is determined based on the two intercepts. Referring to fig. 6, fig. 6 is a schematic diagram of a method for determining a lane line distance according to an embodiment of the present application. The x-axis and y-axis of the first coordinate system are shown in fig. 6, and the lane line equations for the two leftmost lane lines are y1 and y2, respectively. Taking the intercept of lane line equations y1 and y2 on the y-axis as D1 and D2, respectively, | D1-D2| can be determined as the first distance between the leftmost two lane lines.
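A sketch of this intercept-based distance, assuming the fitted polynomials are expressed in the first coordinate system so that the y-axis intercept is simply the constant term d:

#include <cmath>

// Fitted lane line y(x) = d + a*x + b*x^2 + c*x^3 in the first coordinate system.
struct LaneLinePoly { double a, b, c, d; };

// |D1 - D2| from Fig. 6: the y-axis intercept of each lane line is its d term.
double laneLineDistance(const LaneLinePoly& y1, const LaneLinePoly& y2) {
    return std::fabs(y1.d - y2.d);
}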
Step S32, determining, based on the first lane data and the second lane data, a second probability that the vehicle is in each lane of the first road segment at the first time, given that the vehicle was in each lane of the second road segment at the second time.
In some possible embodiments, based on the first lane data corresponding to the first road segment and the second lane data corresponding to the second road segment, the transition probability matrix used by the hidden Markov model to determine the actual lane of the vehicle at the first time can be determined. Any element in the transition probability matrix represents a second probability that the vehicle, being in one lane of the second road segment at the second time, is in one lane of the first road segment at the first time; the matrix as a whole represents these second probabilities for every pair of lanes of the two segments.
Specifically, lane line information of each lane line of the first road section and lane distribution information of each lane of the first road section may be determined based on the first lane data, and lane line information of each lane line of the second road section and lane distribution information of each lane of the second road section may be determined based on the second lane data. Wherein, the lane distribution information may be used to describe the number of lanes in the corresponding road section, the distribution (topology) of each lane, and the like, and the lane line information includes, but is not limited to, the actual color and the actual type of the lane line, and the like.
Further, the target lane may be determined based on the lane line information of each lane line of the first and second road segments and the lane distribution information of each lane. The target lane is a lane that is the same in the first road segment and the second road segment, that is, it runs through both segments.
In the case where the vehicle is in the target lane of the second road segment at the second time, since the target lane is the same lane in the first and second road segments, the probability that the vehicle is in the target lane of the first road segment at the first time is high. Based on this, the preset probability may be determined as the second probability that the vehicle is in the target lane of the first road segment at the first time, given that the vehicle was in the target lane of the second road segment at the second time. The preset probability is the maximum probability in the transition probability matrix, for example 1.
As an example, in a case where the vehicle is in the target lane of the second road segment at the second time, the second probability that the vehicle is in the target lane of the first road segment at the first time may be determined as the preset probability 1.
Further, a vehicle mostly travels along the same lane. Therefore, for any lane of the second road segment (hereinafter referred to as a third lane for convenience of description) and a fourth lane of the first road segment different from the third lane, factors such as the distance between the relative position of the fourth lane in the first road segment and that of the third lane in the second road segment, and the lane line types associated with the third and fourth lanes (for example, types that prohibit lane changes), determine how difficult it is for a vehicle that was in the third lane at the second time to be in the fourth lane at the first time.
Based on this, the differing lanes of the first and second road segments may be determined from the lane line information and the lane distribution information of the two segments. Then, based on the lane line information of each lane line and the lane distribution information of each lane in the first and second road segments, the following second probabilities can be determined: the probability that the vehicle, being in the target lane of the second road segment at the second time, is in each lane of the first road segment other than the target lane at the first time; and the probability that the vehicle, being in each lane of the second road segment other than the target lane at the second time, is in each lane of the first road segment at the first time.
The smaller the number of lanes between the third lane and the fourth lane, the easier it is for a vehicle in the third lane at the second time to reach the fourth lane by the first time, and the higher the corresponding second probability. If the vehicle cannot travel from the third lane to the fourth lane at all, the second probability that the vehicle is in the fourth lane at the first time is almost 0.
Referring to fig. 7a, fig. 7a is a schematic view of a scenario for determining a transition probability matrix according to an embodiment of the present application. As shown in fig. 7a, the second road segment where the vehicle was located at the second time includes lanes B1, B2, and B3, and the first road segment where the vehicle is located at the first time includes lanes A1, A2, and A3. If, from any lane of the second road segment, the vehicle can travel to the adjacent lanes of the first road segment, and lane B1 and lane A1, lane B2 and lane A2, and lane B3 and lane A3 are respectively the same lanes, then the second probability that the vehicle, being in any lane of the second road segment at the second time, is in a lane of the first road segment at the first time is 1/(|m-n|+1), where m is the row index of the probability matrix corresponding to the lanes of the second road segment and n is the column index corresponding to the lanes of the first road segment.
As in fig. 7a, if the vehicle is in lane B1 at the second time, the second probability of being in lane A1 at the first time is 1, in lane A2 is 1/2, and in lane A3 is 1/3; if the vehicle is in lane B2 at the second time, the second probability of being in lane A1 at the first time is 1/2, in lane A2 is 1, and in lane A3 is 1/2; if the vehicle is in lane B3 at the second time, the second probability of being in lane A1 at the first time is 1/3, in lane A2 is 1/2, and in lane A3 is 1.
Referring to fig. 7b, fig. 7b is a schematic diagram of another scenario for determining a transition probability matrix according to an embodiment of the present application. As shown in fig. 7b, the second road segment where the vehicle was located at the second time includes lanes B1, B2, and B3, and the first road segment where the vehicle is located at the first time includes lanes A1, A2, A3, and A4. If, from any lane of the second road segment, the vehicle can travel to the adjacent lanes of the first road segment, then lane B1 and lane A2, lane B2 and lane A3, and lane B3 and lane A4 are respectively the same lanes.
If lane B1 runs through to both lane A1 and lane A2, that is, lane B1 connects to lane A1 and lane B1 connects to lane A2, the preset probability 1 may be determined as the second probability that the vehicle is in lane A1 or lane A2 at the first time given that it was in lane B1 at the second time, and another probability value 0 (or another value smaller than 1) may be determined as the second probability of being in lane A3 or lane A4 at the first time.
If lane B2 passes through to lane A3, then when the vehicle is in lane B2 at the second time, the second probability of being in lane A3 at the first time is 1, and the second probability of being in any other lane of the first road segment at the first time is 0; if lane B3 passes through to lane A4, then when the vehicle is in lane B3 at the second time, the second probability of being in lane A4 at the first time is 1, and the second probability of being in any other lane of the first road segment at the first time is 0.
The transition probability matrix shown in fig. 7b may be determined based on the second probability that the vehicle, when in each lane of the second road segment at the second time, is in each lane of the first road segment at the first time.
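As a companion sketch, the connectivity-based matrix of fig. 7b can be built from a set of through (connected) lane pairs; the pair set below follows the example above, while the helper name and data layout are illustrative assumptions.

```python
# Hedged sketch: connectivity-based transition probabilities. Through lane
# pairs receive the preset probability 1; all other pairs receive 0 (the
# text notes another value smaller than 1 could be used instead).

PRESET_PROB = 1.0

def connectivity_transition_matrix(prev_lanes, cur_lanes, through_pairs):
    """through_pairs: set of (previous lane, current lane) through pairs."""
    return [[PRESET_PROB if (b, a) in through_pairs else 0.0
             for a in cur_lanes]
            for b in prev_lanes]

# B1 passes through to A1 and A2, B2 to A3, B3 to A4 (fig. 7b example):
matrix = connectivity_transition_matrix(
    ["B1", "B2", "B3"],
    ["A1", "A2", "A3", "A4"],
    {("B1", "A1"), ("B1", "A2"), ("B2", "A3"), ("B3", "A4")},
)
print(matrix)  # [[1.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]]
```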
Step S33: determining a third probability that the vehicle is in each lane of the second road segment at the second time, and determining the actual lane in which the vehicle is located at the first time based on the first probability, the second probability, and the third probability.
In some possible embodiments, after determining the third probability that the vehicle is in each lane of the second road segment at the second time, the emission probability matrix, and the transition probability matrix, a fourth probability that the vehicle travels from each lane of the second road segment to each lane of the first road segment may be determined based on the third probability, the emission probability matrix, and the transition probability matrix.
For each lane of the first road segment, the maximum probability among the fourth probabilities that the vehicle travels from the lanes of the second road segment to the lane is determined as a fifth probability that the vehicle is in the lane of the first road segment at the first time. Specifically, the determination can be made by the Viterbi algorithm:
W(j) = max_{i=1…N′} { prevW(i) × transMatrix(i, j) } × emissionProb[j];
where prevW(i) is the third probability matrix, an element of which represents the third probability that the vehicle is in lane i of the second road segment at the second time, i being the index of each lane in the second road segment; transMatrix(i, j) represents the transition probability matrix, an element of which represents the second probability that the vehicle is in lane j of the first road segment at the first time when the vehicle is in lane i of the second road segment at the second time, j being the index of each lane in the first road segment; and emissionProb[j] represents the emission probability matrix, an element of which represents the first probability that lane j of the first road segment is within the field of view of the vehicle when the vehicle is on the second road segment.
Here N′ represents the number of lanes of the second road segment. Based on the above formula, each element W(j) represents the fifth probability that the vehicle is in lane j of the first road segment at the first time.
Further, after obtaining the fifth probability that the vehicle is in each lane of the first road segment at the first time, the lane corresponding to the maximum of the fifth probabilities, i.e., argmax_{j=1…N} { W(j) }, may be determined as the actual lane in which the vehicle is located in the first road segment at the first time, where N is the number of lanes in the first road segment.
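The Viterbi step above can be summarized in a short sketch; it assumes plain Python lists for prevW, transMatrix, and emissionProb, mirroring the formula rather than any particular library.

```python
# Hedged sketch of the Viterbi step:
# W(j) = max_{i=1..N'} { prevW(i) * transMatrix(i, j) } * emissionProb[j],
# followed by argmax_{j=1..N} { W(j) } to pick the actual lane.

def viterbi_step(prev_w, trans_matrix, emission_prob):
    n_prev, n_cur = len(prev_w), len(emission_prob)
    w = [max(prev_w[i] * trans_matrix[i][j] for i in range(n_prev))
         * emission_prob[j]
         for j in range(n_cur)]
    actual_lane = max(range(n_cur), key=lambda j: w[j])  # argmax over lanes
    return w, actual_lane

# Uniform initial distribution over three lanes, the fig. 7a transition
# matrix, and an illustrative emission probability vector:
prev_w = [1 / 3, 1 / 3, 1 / 3]
trans = [[1.0, 0.5, 1 / 3], [0.5, 1.0, 0.5], [1 / 3, 0.5, 1.0]]
emis = [0.2, 0.5, 0.3]
w, lane = viterbi_step(prev_w, trans, emis)
print(w, lane)  # lane 1 (0-based) has the largest fifth probability
```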
If the first time is the initial time of the lane-level positioning, the third probability that the vehicle is in each lane of the second road segment at the second time is the initial probability distribution corresponding to the hidden Markov model. In this case the third probability is the same for each lane of the second road segment and may be determined based on the number of lanes of the first road segment. For example, if the first road segment has N lanes, the third probability that the vehicle is in each lane of the second road segment at the second time is 1/N.
If the first time is any time other than the initial time in the lane-level positioning process, the third probability W(i) that the vehicle is in each lane of the second road segment at the second time can likewise be determined based on the Viterbi algorithm. For example, after the actual lane in which the vehicle is located in the first road segment at the first time has been determined, when determining the actual lane in which the vehicle is located in a third road segment at a third time (the time next to the first time), the sixth probabilities W(k) that the vehicle is in each lane of the third road segment at the third time may be determined based on W(k) = max_{j=1…N} { prevW(j) × transMatrix(j, k) } × emissionProb[k], and the lane corresponding to the maximum of the sixth probabilities W(k) is then determined as the actual lane in which the vehicle is located in the third road segment at the third time. Here prevW(j) is the fifth probability that the vehicle is in each lane of the first road segment at the first time, and N is the number of lanes in the first road segment.
Based on the above implementation, the actual lane in which the vehicle is located at any one time may be determined.
It should be noted that, if the first time is the initial time of the lane-level positioning, the first road segment where the vehicle is located at the first time may be determined as the second road segment where the vehicle is located at the second time, and the first lane data of the first road segment at the first time may be determined as the second lane data of the second road segment at the second time. That is, in the case where the first time is the initial time of the lane-level positioning, the vehicle is considered to be located on the same road segment at the first time and the second time.
If the second lane data of the second road segment where the vehicle is located at the second time cannot be acquired, the first time is determined as the initial time of the lane-level positioning. For example, the lane data of any road segment may be obtained from ADAS data, and the vehicle may be outside the ADAS data coverage area at the second time, such as when the vehicle first enters the ADAS data coverage area at the first time, or travels out of the coverage area midway and re-enters it at the first time; in such cases, the first time may be determined as the initial time of the lane-level positioning. Alternatively, if it is determined based on the first lane data of the first road segment at the first time that the vehicle enters a new intersection, a ramp, or the like at the first time, the first time may also be determined as the initial time of the lane-level positioning, and the actual lane in which the vehicle is located in the first road segment at the first time is then determined based on the above implementations.
The lane-level positioning method provided by the embodiment of the present application is further described with reference to fig. 8. Fig. 8 is a flowchart of a lane-level positioning method according to an embodiment of the present application. As shown in fig. 8, the first lane data of the first road segment where the vehicle is located at the first time may be obtained based on the positioning information of the vehicle at the first time. If the first lane data of the first road segment cannot be acquired, for example because the first road segment is not covered by the ADAS data, the method continues to wait until the lane data corresponding to the road segment where the vehicle is located at some later time is acquired.
Further, it is determined whether second lane data exist for the second road segment where the vehicle is located at the second time. If the second lane data exist, the emission probability and the transition probability are determined based on the acquired lane image, the first lane data, and the second lane data, and the actual lane in which the vehicle is located at the first time is determined based on the emission probability, the transition probability, and the third probability that the vehicle is in each lane of the second road segment at the second time.
If the second lane data do not exist, the first lane data are determined as the second lane data, the first road segment is determined as the second road segment, and the first time is determined as the initial time of the lane-level positioning. Further, the emission probability and the transition probability are determined based on the first lane data, the second lane data (i.e., the first lane data), and the lane image, and the initial probability distribution is determined based on the number of lanes of the first road segment, so that the actual lane in which the vehicle is located at the first time is determined based on the emission probability, the transition probability, and the initial probability distribution.
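The branching in fig. 8 can be sketched as a single positioning step. State, the builder callables, and the dict layout of the lane data below are hypothetical placeholders rather than structures defined by the patent, and viterbi_step refers to the sketch shown earlier.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class State:
    lane_data: dict   # lane data of the road segment at the previous time
    w: list           # probability vector over that segment's lanes

def locate_lane(first_lane_data: dict, lane_image,
                prev: Optional[State],
                build_emission: Callable, build_transition: Callable,
                viterbi_step: Callable):
    """One lane-level positioning step; returns (actual lane index, State)."""
    if prev is None:
        # Initial time: reuse the first lane data as the second lane data
        # and start from the uniform initial distribution 1/N over N lanes.
        n = len(first_lane_data["lanes"])
        prev = State(first_lane_data, [1.0 / n] * n)
    emission = build_emission(first_lane_data, lane_image)
    transition = build_transition(prev.lane_data, first_lane_data)
    w, actual_lane = viterbi_step(prev.w, transition, emission)
    return actual_lane, State(first_lane_data, w)
```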
The data processing, computing, and other processes involved in the embodiments of the present application can be performed based on computer technology, cloud computing, and other modes. Cloud computing is a product of the development and fusion of traditional computer and network technologies such as grid computing, distributed computing, parallel computing, utility computing, network storage, virtualization, and load balancing, and the data processing and computing efficiency in the embodiments of the present application can be improved on this basis.
In the embodiments of the present application, the lane data of each road segment are basic road data including lane line types, lane line colors, and lane line distribution information, so the lane data are easy to acquire and simple to process; the lane image is likewise acquired in a simple manner with little dependence on equipment, which improves the positioning efficiency of lane-level positioning. Moreover, the actual lane in which the vehicle is located at any time can be determined based on the lane data of the road segment where the vehicle is located at that time, the lane data of the road segment where the vehicle was located at the previous time, and the lane image in front of the vehicle at that time, so that the vehicle can be positioned at the lane level in real time. The method is applicable to positioning the vehicle in different road scenarios and therefore has high applicability.
Referring to fig. 9, fig. 9 is a schematic structural diagram of a lane-level positioning device provided in an embodiment of the present application. The lane level positioner that this application embodiment provided includes:
the lane data acquiring module 91 is configured to acquire first lane data of a first road segment where a vehicle is located at a first time and second lane data of a second road segment where the vehicle is located at a second time, where the second time is a previous time to the first time;
a lane image obtaining module 92, configured to obtain a lane image in front of the vehicle at a first time, and determine lane line information of each lane line in the lane image;
a lane positioning module 93, configured to determine, based on the first lane data, the second lane data, and the lane line information of each lane line in the lane image, the actual lane where the vehicle is located at the first time.
In some possible embodiments, the lane data acquiring module 91 is configured to:
determining positioning information of the vehicle at a first moment;
determining a first road section where the vehicle is located at the first moment based on the positioning information;
and acquiring first lane data of the first road section.
In some possible embodiments, the lane positioning module 93 is configured to:
determining a first probability that each lane of the first road segment is within a visual field range of the vehicle when the vehicle is on the second road segment, based on the first lane data and the lane line information of each lane line in the lane image;
determining a second probability that the vehicle is in each lane of the first road segment at the first time when the vehicle is in each lane of the second road segment at the second time, based on the first lane data and the second lane data;
determining a third probability that the vehicle is in each lane of the second road segment at the second time, and determining the actual lane in which the vehicle is located at the first time based on the first probability, the second probability, and the third probability.
In some possible embodiments, the lane positioning module 93 is configured to:
determining lane line information of each lane line of the first road section based on the first lane data;
determining a first probability that each lane of the first road segment is within the field of view of the vehicle when the vehicle is on the second road segment, based on the number of lane lines in the lane image, the lane line information of each lane line in the lane image, and the lane line information of each lane line of the first road segment.
In some possible embodiments, the lane positioning module 93 is configured to:
determining a lane line combination corresponding to each lane of the first road segment, wherein each lane line combination comprises the lane lines corresponding to the respective lane, and the number of lane lines in each lane line combination is the same as the number of lane lines in the lane image;
for each lane line combination, determining a matching degree of the lane line combination and each lane line in the lane image based on lane line information of each lane line in the lane line combination and lane line information of each lane line in the lane image, and determining a first probability that a lane corresponding to the lane line combination is within a visual field range of the vehicle when the vehicle is on the second road section based on the matching degree corresponding to the lane line combination.
In some possible embodiments, for each lane line combination, the matching degree of the lane line combination with each lane line in the lane image includes the matching degree of each lane line in the lane line combination with the corresponding lane line in the lane image;
the lane positioning module 93 is configured to:
determining a lane line weight corresponding to each lane line in the lane image based on the lane line information of each lane line in the lane image;
and determining a first probability that the lane corresponding to the lane line combination is in the visual field range of the vehicle in the second road section based on the matching degree of each lane line in the lane line combination and the corresponding lane line in the lane image and the lane line weight corresponding to each lane line in the lane image.
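One natural reading of this weighted combination is sketched below; the normalization and the specific numbers are illustrative assumptions, since the patent does not fix the combination rule numerically.

```python
# Hedged sketch: first (emission) probability of one lane line combination
# as the normalized, weighted sum of per-line matching degrees.

def emission_probability(match_degrees, line_weights):
    """match_degrees[k]: matching degree of the k-th line of the combination
    with the k-th detected lane line; line_weights[k]: that line's weight."""
    weighted = sum(w * m for w, m in zip(line_weights, match_degrees))
    return weighted / sum(line_weights)  # keep the result within [0, 1]

# Three detected lane lines, the inner line weighted highest (illustrative):
print(emission_probability([1.0, 0.8, 0.5], [0.25, 0.5, 0.25]))  # 0.775
```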
In some possible embodiments, the lane line information of each lane line in the lane line combination and the lane line information of each lane line in the lane image include a lane line type and a lane line color;
for each lane line combination, the lane positioning module 93 is configured to:
determining the type matching degree of the lane line type of each lane line in the lane line combination and the lane line type of the corresponding lane line in the lane image;
determining the color matching degree of the lane line color of each lane line in the lane line combination and the lane line color of the corresponding lane line in the lane image;
and determining the matching degree of each lane line in the lane line combination and the corresponding lane line in the lane image based on the type matching degree and the color matching degree corresponding to each lane line in the lane line combination.
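A minimal sketch of this per-line matching degree follows; the equal weighting of the type matching degree and the color matching degree is an illustrative assumption.

```python
# Hedged sketch: matching degree of one lane line in a combination against
# the corresponding lane line in the lane image, from type and color matches.

def line_match_degree(map_line: dict, img_line: dict,
                      type_weight: float = 0.5,
                      color_weight: float = 0.5) -> float:
    """map_line / img_line: dicts with 'type' (e.g. 'dashed') and 'color'."""
    type_match = 1.0 if map_line["type"] == img_line["type"] else 0.0
    color_match = 1.0 if map_line["color"] == img_line["color"] else 0.0
    return type_weight * type_match + color_weight * color_match

# Type matches, color does not, so the matching degree is 0.5:
print(line_match_degree({"type": "dashed", "color": "white"},
                        {"type": "dashed", "color": "yellow"}))
```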
In some possible embodiments, the lane positioning module 93 is configured to:
determining a first distance between two leftmost lane lines and a second distance between two rightmost lane lines in the lane image;
determining a road edge line in the lane image based on the lane line information of each lane line in the lane image;
if the leftmost lane line and the rightmost lane line in the lane image are both road edge lines, or neither the leftmost lane line nor the rightmost lane line is a road edge line, determining each preset weight in the first weight combination as a lane line weight corresponding to each lane line in the lane image;
if the first distance is smaller than a first threshold value, and the leftmost lane line is a road edge line and the rightmost lane line is a non-road edge line, determining each preset weight in the second weight combination as a lane line weight corresponding to each lane line in the lane image;
if the first distance is greater than or equal to the first threshold, and the leftmost lane line is a road edge line and the rightmost lane line is a non-road edge line, determining each preset weight in the third weight combination as a lane line weight corresponding to each lane line in the lane image;
if the second distance is less than a second threshold, and the rightmost lane line is a road edge line and the leftmost lane line is a non-road edge line, determining each preset weight in a fourth weight combination as a lane line weight corresponding to each lane line in the lane image;
and if the second distance is greater than or equal to the second threshold, and the rightmost lane line is a road edge line and the leftmost lane line is a non-road edge line, determining each preset weight in the fifth weight combination as a lane line weight corresponding to each lane line in the lane image.
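The five branches above reduce to the small decision function sketched below; the weight combinations are placeholders because the patent leaves their numeric values open.

```python
# Hedged sketch: choosing the preset weight combination from the road edge
# line layout and the first/second distances. W1..W5 stand in for the five
# preset weight combinations, whose values the text does not specify.

W1, W2, W3, W4, W5 = "first", "second", "third", "fourth", "fifth"

def pick_weight_combination(left_is_edge: bool, right_is_edge: bool,
                            first_dist: float, second_dist: float,
                            first_threshold: float, second_threshold: float):
    if left_is_edge == right_is_edge:       # both edge lines, or neither
        return W1
    if left_is_edge:                        # only the leftmost is an edge line
        return W2 if first_dist < first_threshold else W3
    return W4 if second_dist < second_threshold else W5  # rightmost is edge

# Leftmost line is an edge line and the first distance is below threshold:
print(pick_weight_combination(True, False, 2.8, 3.4, 3.0, 3.0))  # "second"
```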
In some possible embodiments, the lane positioning module 93 is configured to:
determining lane line information of each lane line of the first road segment and lane distribution information of each lane of the first road segment based on the first lane data, and determining lane line information of each lane line of the second road segment and lane distribution information of each lane of the second road segment based on the second lane data;
determining a target lane based on the lane line information of each lane line of the first road segment and the second road segment and the lane distribution information of each lane, and determining a preset probability as a second probability that the vehicle is in the target lane of the first road segment at the first time when the vehicle is in the target lane of the second road segment at the second time;
and determining, based on the lane line information of each lane line of the first road segment and the second road segment and the lane distribution information of each lane, a second probability that the vehicle is in each lane of the first road segment other than the target lane at the first time when the vehicle is in the target lane of the second road segment at the second time, and a second probability that the vehicle is in each lane of the first road segment at the first time when the vehicle is in each lane of the second road segment other than the target lane at the second time.
In some possible embodiments, the lane positioning module 93 is configured to:
determining a fourth probability that the vehicle will travel from each lane of the second road segment to each lane of the first road segment based on the first, second, and third probabilities;
determining, for each lane of the first road segment, the maximum probability among the fourth probabilities that the vehicle travels from the lanes of the second road segment to the lane as a fifth probability that the vehicle is in the lane at the first time;
and determining the lane corresponding to the maximum probability in the fifth probabilities corresponding to the lanes in the first road section as the actual lane where the vehicle is located at the first moment.
In some possible embodiments, the lane positioning module 93 is configured to:
and determining, when the first time is an initial time of lane-level positioning, a third probability that the vehicle is in each lane of the second road segment at the second time based on the number of lanes of the first road segment.
In some possible embodiments, the lane positioning module 93 is configured to:
constructing a first coordinate system based on the positioning information of the vehicle at a first moment;
determining lane line equations of the two leftmost lane lines in the lane image in the first coordinate system;
and determining a first distance between the two leftmost lane lines in the lane image based on the lane line equations.
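A brief sketch of the distance computation follows, assuming (a common ADAS convention, not fixed by the patent) that each detected lane line is given as a polynomial lateral offset x(y) in the vehicle-centered first coordinate system; the distance near the vehicle is then the offset difference at y = 0.

```python
# Hedged sketch: lateral distance between two lane lines described by
# polynomial equations x = c0 + c1*y + c2*y^2 + ... in the first coordinate
# system, evaluated at longitudinal offset y (y = 0 is the vehicle position).

def lateral_distance(coeffs_a, coeffs_b, y: float = 0.0) -> float:
    x_a = sum(c * y**k for k, c in enumerate(coeffs_a))
    x_b = sum(c * y**k for k, c in enumerate(coeffs_b))
    return abs(x_a - x_b)

# Two leftmost lane lines roughly 3.5 m apart at the vehicle position:
print(lateral_distance([-5.3, 0.01], [-1.8, 0.0]))  # 3.5
```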
In a specific implementation, the lane-level positioning device may execute, through its built-in functional modules, the implementations provided in the steps of fig. 2 and/or fig. 3; reference may be made to the implementations provided in the respective steps, which are not described herein again.
Referring to fig. 10, fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 10, the electronic device 1000 in the present embodiment may include: a processor 1001, a network interface 1004, and a memory 1005. In addition, the electronic device 1000 may further include: a user interface 1003 and at least one communication bus 1002, where the communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display screen (Display) and a keyboard (Keyboard); optionally, the user interface 1003 may also include a standard wired interface and a standard wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory, such as at least one disk memory. The memory 1005 may optionally also be at least one storage device located remotely from the processor 1001. As shown in fig. 10, the memory 1005, which is a computer-readable storage medium, may include an operating system, a network communication module, a user interface module, and a device control application program.
In the electronic device 1000 shown in fig. 10, the network interface 1004 may provide a network communication function; the user interface 1003 is an interface for providing a user with input; and the processor 1001 may be used to invoke a device control application stored in the memory 1005 to implement:
acquiring first lane data of a first road section where a vehicle is located at a first moment and second lane data of a second road section where the vehicle is located at a second moment, wherein the second moment is a previous moment of the first moment;
acquiring a lane image in front of the vehicle at a first moment, and determining lane line information of each lane line in the lane image;
and determining the actual lane where the vehicle is located at the first time based on the first lane data, the second lane data and lane line information of each lane line in the lane image.
In some possible embodiments, the processor 1001 is configured to:
determining positioning information of the vehicle at a first moment;
determining a first road section where the vehicle is located at the first moment based on the positioning information;
and acquiring first lane data of the first road section.
In some possible embodiments, the processor 1001 is configured to:
determining a first probability that each lane of the first road segment is within a visual field range of the vehicle when the vehicle is on the second road segment, based on the first lane data and the lane line information of each lane line in the lane image;
determining a second probability that the vehicle is in each lane of the first road segment at the first time when the vehicle is in each lane of the second road segment at the second time, based on the first lane data and the second lane data;
determining a third probability that the vehicle is in each lane of the second road segment at the second time, and determining the actual lane in which the vehicle is located at the first time based on the first probability, the second probability, and the third probability.
In some possible embodiments, the processor 1001 is configured to:
determining lane line information of each lane line of the first road section based on the first lane data;
determining a first probability that each lane of the first road segment is within the field of view of the vehicle when the vehicle is on the second road segment, based on the number of lane lines in the lane image, the lane line information of each lane line in the lane image, and the lane line information of each lane line of the first road segment.
In some possible embodiments, the processor 1001 is configured to:
determining a lane line combination corresponding to each lane of the first road segment, wherein each lane line combination comprises the lane lines corresponding to the respective lane, and the number of lane lines in each lane line combination is the same as the number of lane lines in the lane image;
for each lane line combination, determining a matching degree of the lane line combination and each lane line in the lane image based on lane line information of each lane line in the lane line combination and lane line information of each lane line in the lane image, and determining a first probability that a lane corresponding to the lane line combination is within a visual field range of the vehicle when the vehicle is on the second road section based on the matching degree corresponding to the lane line combination.
In some possible embodiments, for each lane line combination, the matching degree of the lane line combination with each lane line in the lane image includes the matching degree of each lane line in the lane line combination with the corresponding lane line in the lane image;
the processor 1001 is configured to:
determining a lane line weight corresponding to each lane line in the lane image based on the lane line information of each lane line in the lane image;
and determining a first probability that the lane corresponding to the lane line combination is in the visual field range of the vehicle in the second road section based on the matching degree of each lane line in the lane line combination and the corresponding lane line in the lane image and the lane line weight corresponding to each lane line in the lane image.
In some possible embodiments, the lane line information of each lane line in the lane line combination and the lane line information of each lane line in the lane image include a lane line type and a lane line color;
for each lane line combination, the processor 1001 is configured to:
determining the type matching degree of the lane line type of each lane line in the lane line combination and the lane line type of the corresponding lane line in the lane image;
determining the color matching degree of the lane line color of each lane line in the lane line combination and the lane line color of the corresponding lane line in the lane image;
and determining the matching degree of each lane line in the lane line combination and the corresponding lane line in the lane image based on the type matching degree and the color matching degree corresponding to each lane line in the lane line combination.
In some possible embodiments, the processor 1001 is configured to:
determining a first distance between two leftmost lane lines and a second distance between two rightmost lane lines in the lane image;
determining a road edge line in the lane image based on the lane line information of each lane line in the lane image;
if the leftmost lane line and the rightmost lane line in the lane image are both road edge lines, or neither the leftmost lane line nor the rightmost lane line is a road edge line, determining each preset weight in the first weight combination as a lane line weight corresponding to each lane line in the lane image;
if the first distance is smaller than a first threshold value, and the leftmost lane line is a road edge line and the rightmost lane line is a non-road edge line, determining each preset weight in the second weight combination as a lane line weight corresponding to each lane line in the lane image;
if the first distance is greater than or equal to the first threshold, and the leftmost lane line is a road edge line and the rightmost lane line is a non-road edge line, determining each preset weight in the third weight combination as a lane line weight corresponding to each lane line in the lane image;
if the second distance is less than a second threshold, and the rightmost lane line is a road edge line and the leftmost lane line is a non-road edge line, determining each preset weight in a fourth weight combination as a lane line weight corresponding to each lane line in the lane image;
and if the second distance is greater than or equal to the second threshold, and the rightmost lane line is a road edge line and the leftmost lane line is a non-road edge line, determining each preset weight in the fifth weight combination as a lane line weight corresponding to each lane line in the lane image.
In some possible embodiments, the processor 1001 is configured to:
determining lane line information of each lane line of the first road segment and lane distribution information of each lane of the first road segment based on the first lane data, and determining lane line information of each lane line of the second road segment and lane distribution information of each lane of the second road segment based on the second lane data;
determining a target lane based on the lane line information of each lane line of the first road segment and the second road segment and the lane distribution information of each lane, and determining a preset probability as a second probability that the vehicle is in the target lane of the first road segment at the first time when the vehicle is in the target lane of the second road segment at the second time;
and determining, based on the lane line information of each lane line of the first road segment and the second road segment and the lane distribution information of each lane, a second probability that the vehicle is in each lane of the first road segment other than the target lane at the first time when the vehicle is in the target lane of the second road segment at the second time, and a second probability that the vehicle is in each lane of the first road segment at the first time when the vehicle is in each lane of the second road segment other than the target lane at the second time.
In some possible embodiments, the processor 1001 is configured to:
determining a fourth probability that the vehicle will travel from each lane of the second road segment to each lane of the first road segment based on the first, second, and third probabilities;
determining, for each lane of the first road segment, the maximum probability among the fourth probabilities that the vehicle travels from the lanes of the second road segment to the lane as a fifth probability that the vehicle is in the lane at the first time;
and determining the lane corresponding to the maximum probability in the fifth probabilities corresponding to the lanes in the first road section as the actual lane where the vehicle is located at the first moment.
In some possible embodiments, the processor 1001 is configured to:
and determining, when the first time is an initial time of lane-level positioning, a third probability that the vehicle is in each lane of the second road segment at the second time based on the number of lanes of the first road segment.
In some possible embodiments, the processor 1001 is configured to:
constructing a first coordinate system based on the positioning information of the vehicle at a first moment;
determining lane line equations of the two leftmost lane lines in the lane image in the first coordinate system;
and determining a first distance between the two leftmost lane lines in the lane image based on the lane line equations.
It should be understood that, in some possible embodiments, the processor 1001 may be a Central Processing Unit (CPU), and may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The memory may include a read-only memory and a random access memory, and provides instructions and data to the processor. A portion of the memory may also include a non-volatile random access memory. For example, the memory may also store device type information.
In a specific implementation, the electronic device 1000 may execute, through its built-in functional modules, the implementations provided in the steps of fig. 2 and/or fig. 3; reference may be made to the implementations provided in the respective steps, which are not described herein again.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method provided in the steps of fig. 2 and/or fig. 3; reference may be made to the implementations provided in the respective steps, which are not described herein again.
The computer-readable storage medium may be an internal storage unit of the lane-level positioning device and/or the electronic device described above, such as a hard disk or a memory of the electronic device. The computer-readable storage medium may also be an external storage device of the electronic device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the electronic device. The computer-readable storage medium may further include a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), and the like. Further, the computer-readable storage medium may also include both an internal storage unit and an external storage device of the electronic device. The computer-readable storage medium is used to store the computer program and other programs and data required by the electronic device, and may also be used to temporarily store data that has been output or is to be output.
Embodiments of the present application further provide a computer program product or a computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the electronic device reads the computer instructions from the computer-readable storage medium and executes them, causing the electronic device to perform the methods provided in the steps of fig. 2 and/or fig. 3.
The terms "first", "second", and the like in the claims and in the description and drawings of the present application are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or electronic device that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or electronic device. Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments. The term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both, and that, to clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functionality. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above disclosure is only for the purpose of illustrating the preferred embodiments of the present application and is not intended to limit the scope of the present application, which is defined by the appended claims.

Claims (15)

1. A lane-level positioning method, the method comprising:
acquiring first lane data of a first road section where a vehicle is located at a first moment and second lane data of a second road section where the vehicle is located at a second moment, wherein the second moment is the moment immediately preceding the first moment;
acquiring a lane image in front of the vehicle at a first moment, and determining lane line information of each lane line in the lane image;
and determining the actual lane where the vehicle is located at the first moment based on the first lane data, the second lane data and the lane line information of each lane line in the lane image.
2. The method of claim 1, wherein obtaining first roadway data for a first road segment over which the vehicle is located at a first time comprises:
determining positioning information of the vehicle at a first moment;
determining a first road segment where the vehicle is located at the first moment based on the positioning information;
and acquiring first lane data of the first road section.
3. The method of claim 1, wherein determining the actual lane in which the vehicle is located at the first time based on the first lane data, the second lane data, and lane line information for each lane line in the lane image comprises:
determining a first probability that each lane of the first road section is within a visual field range of the vehicle when the vehicle is on the second road section, based on the first lane data and the lane line information of each lane line in the lane image;
determining a second probability that the vehicle is in each lane of the first road section at the first moment when the vehicle is in each lane of the second road section at the second moment, based on the first lane data and the second lane data;
and determining a third probability that the vehicle is in each lane of the second road section at the second moment, and determining an actual lane in which the vehicle is located at the first moment based on the first probability, the second probability and the third probability.
4. The method of claim 3, wherein determining a first probability that each lane of the first road section is within a visual field range of the vehicle when the vehicle is on the second road section based on the first lane data and lane line information of each lane line in the lane image comprises:
determining lane line information of each lane line of the first road section based on the first lane data;
determining a first probability that each lane of the first road section is in a visual field range of the vehicle when the vehicle is on the second road section based on the number of lane lines in the lane image, lane line information of each lane line in the lane image, and lane line information of each lane line of the first road section.
5. The method of claim 4, wherein determining a first probability that each lane of the first road segment is within a field of view of the vehicle when in the second road segment based on the number of lane lines in the lane image, lane line information for each lane line in the lane image, and lane line information for each lane line of the first road segment comprises:
determining a lane line combination corresponding to each lane of the first road section, wherein each lane line combination comprises the lane lines corresponding to the respective lane, and the number of lane lines in each lane line combination is the same as the number of lane lines in the lane image;
for each lane line combination, determining the matching degree of the lane line combination and each lane line in the lane image based on the lane line information of each lane line in the lane line combination and the lane line information of each lane line in the lane image, and determining a first probability that the lane corresponding to the lane line combination is in the view field of the vehicle in the second road section based on the matching degree corresponding to the lane line combination.
6. The method of claim 5, wherein, for each of the lane line combinations, the degree of matching of the lane line combination with each lane line in the lane image comprises the degree of matching of each lane line in the lane line combination with the corresponding lane line in the lane image;
the determining, based on the matching degree corresponding to the lane line combination, a first probability that a lane corresponding to the lane line combination is within a visual field range of the vehicle when the vehicle is in the second road segment includes:
determining the lane line weight corresponding to each lane line in the lane image based on the lane line information of each lane line in the lane image;
and determining a first probability that the lane corresponding to the lane line combination is in the view field of the vehicle in the second road section based on the matching degree of each lane line in the lane line combination and the corresponding lane line in the lane image and the lane line weight corresponding to each lane line in the lane image.
7. The method of claim 6, wherein the lane line information of each lane line in the lane line combination and the lane line information of each lane line in the lane image include a lane line type and a lane line color;
for each lane line combination, determining the matching degree of each lane line in the lane line combination and the corresponding lane line in the lane image based on the lane line information of each lane line in the lane line combination and the lane line information of each lane line in the lane image, including:
determining the type matching degree of the lane line type of each lane line in the lane line combination and the lane line type of the corresponding lane line in the lane image;
determining the color matching degree of the lane line color of each lane line in the lane line combination and the lane line color of the corresponding lane line in the lane image;
and determining the matching degree of each lane line in the lane line combination and the corresponding lane line in the lane image based on the type matching degree and the color matching degree corresponding to each lane line in the lane line combination.
8. The method of claim 6, wherein determining the lane line weight corresponding to each lane line in the lane image based on the lane line information of each lane line in the lane image comprises:
determining a first distance between two leftmost lane lines and a second distance between two rightmost lane lines in the lane image;
determining a road edge line in the lane image based on the lane line information of each lane line in the lane image;
if the leftmost lane line and the rightmost lane line in the lane image are both road edge lines, or neither the leftmost lane line nor the rightmost lane line is a road edge line, determining each preset weight in the first weight combination as the lane line weight corresponding to each lane line in the lane image;
if the first distance is smaller than a first threshold value, the leftmost lane line is a road edge line, and the rightmost lane line is a non-road edge line, determining each preset weight in a second weight combination as a lane line weight corresponding to each lane line in the lane image;
if the first distance is greater than or equal to the first threshold, and the leftmost lane line is a road edge line and the rightmost lane line is a non-road edge line, determining each preset weight in a third weight combination as a lane line weight corresponding to each lane line in the lane image;
if the second distance is smaller than a second threshold value, and the rightmost lane line is a road edge line and the leftmost lane line is a non-road edge line, determining each preset weight in a fourth weight combination as a lane line weight corresponding to each lane line in the lane image;
and if the second distance is greater than or equal to the second threshold, and the rightmost lane line is a road edge line and the leftmost lane line is a non-road edge line, determining each preset weight in a fifth weight combination as a lane line weight corresponding to each lane line in the lane image.
9. The method of claim 3, wherein determining a second probability that the vehicle is in each lane of the first road section at the first moment when the vehicle is in each lane of the second road section at the second moment based on the first lane data and the second lane data comprises:
determining lane line information of each lane line of the first road section and lane distribution information of each lane of the first road section based on the first lane data, and determining lane line information of each lane line of the second road section and lane distribution information of each lane in the second road section based on the second lane data;
determining a target lane based on lane line information of each lane line of the first road section and the second road section and lane distribution information of each lane, and determining a preset probability as a second probability of being in the target lane of the first road section at the first moment under the condition that the vehicle is in the target lane of the second road section at the second moment;
determining, based on the lane line information of each lane line of the first road segment and the second road segment and the lane distribution information of each lane, a second probability that the vehicle is in each lane of the first road segment other than the target lane at the first time when the vehicle is in the target lane of the second road segment at the second time, and a second probability that the vehicle is in each lane of the first road segment at the first time when the vehicle is in each lane of the second road segment other than the target lane at the second time.
10. The method of claim 3, wherein the determining the actual lane in which the vehicle is located at the first time based on the first probability, the second probability, and the third probability comprises:
determining a fourth probability that the vehicle travels from lanes of the second road segment to each lane of the first road segment based on the first, second, and third probabilities;
for each lane of the first road segment, determining the maximum probability among the fourth probabilities that the vehicle travels from the lanes of the second road segment to the lane as a fifth probability that the vehicle is in the lane at the first time;
and determining the lane corresponding to the maximum probability in the fifth probabilities corresponding to all lanes in the first road section as the actual lane where the vehicle is located at the first moment.
11. The method of claim 3, wherein the determining a third probability that the vehicle is in each lane of the second road segment at the second time comprises:
and if the first moment is the initial moment of lane-level positioning, determining a third probability that the vehicle is in each lane of the second road section at the second moment based on the number of lanes of the first road section.
12. The method of claim 8, wherein the determining a first distance between two leftmost lane lines in the lane image comprises:
constructing a first coordinate system based on the positioning information of the vehicle at a first moment;
determining lane line equations of the two leftmost lane lines in the lane image in the first coordinate system;
and determining a first distance between the two leftmost lane lines in the lane image based on the lane line equations.
13. A lane-level locating apparatus, the apparatus comprising:
the lane data acquisition module is used for acquiring first lane data of a first road section where a vehicle is located at a first moment and second lane data of a second road section where the vehicle is located at a second moment, wherein the second moment is the moment immediately preceding the first moment;
the lane image acquisition module is used for acquiring a lane image in front of the vehicle at a first moment and determining lane line information of each lane line in the lane image;
and the lane positioning module is used for determining the actual lane where the vehicle is located at the first moment based on the first lane data, the second lane data and the lane line information of each lane line in the lane image.
14. An electronic device comprising a processor and a memory, the processor and the memory being interconnected;
the memory is used for storing a computer program;
the processor is configured to perform the method of any of claims 1 to 12 when the computer program is invoked.
15. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which is executed by a processor to implement the method of any one of claims 1 to 12.