WO2018145602A1 - Method, apparatus and storage medium for determining a lane - Google Patents

Method, apparatus and storage medium for determining a lane

Info

Publication number
WO2018145602A1
Authority
WO
WIPO (PCT)
Prior art keywords
lane
line
feature
vehicle
driving
Prior art date
Application number
PCT/CN2018/075052
Other languages
English (en)
French (fr)
Inventor
付玉锦
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Priority to JP2019524866A (granted as JP6843990B2)
Priority to EP18750637.3A (granted as EP3534114B1)
Priority to KR1020217000555A (granted as KR102266830B1)
Priority to KR1020197018486A (published as KR20190090393A)
Publication of WO2018145602A1 (zh)
Priority to US16/439,496 (granted as US11094198B2)

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06 Road conditions
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3602 Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3446 Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/14 Adaptive cruise control
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3807 Creation or updating of map data characterised by the type of data
    • G01C21/3815 Road data
    • G01C21/3819 Road shape data, e.g. outline of a route
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01 Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13 Receivers
    • G01S19/33 Multimode operation in different systems which transmit time stamped messages, e.g. GPS/GLONASS
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Definitions

  • The present application relates to positioning technology and, in particular, to a method and apparatus for determining a lane, and a storage medium.
  • A method for determining a lane includes: acquiring image information, collected on a vehicle, of the road surface on which the vehicle travels; identifying, from the image information, first lane information of at least one first lane of the road surface, wherein the first lane information includes a positional relationship between the driving lane in which the vehicle is located on the road surface and the at least one first lane, and a feature of the at least one first lane; and determining a target lane of the vehicle in a map of the location of the road surface by feature-matching the feature of the at least one first lane with a feature of at least one second lane in the map, wherein the target lane is the lane in the map corresponding to the driving lane, and the feature of the at least one second lane is acquired from the map.
  • A computing device is further provided in accordance with an embodiment of the present application, comprising a processor and a memory, wherein the memory stores computer readable instructions executable by the processor to: obtain image information, collected on a vehicle, of the road surface on which the vehicle travels; identify, from the image information, first lane information of at least one first lane of the road surface, wherein the first lane information includes a positional relationship between the driving lane in which the vehicle is located on the road surface and the at least one first lane, and a feature of the at least one first lane; and determine a target lane of the vehicle in a map of the road surface by feature-matching the feature of the at least one first lane with a feature of at least one second lane in the map, wherein the target lane is the lane in the map corresponding to the driving lane, and the feature of the at least one second lane is obtained from the map.
  • An embodiment of the present application further provides a non-transitory computer readable storage medium storing computer readable instructions which may cause at least one processor to perform the method described above.
  • FIG. 1a is a schematic diagram of an implementation environment according to an embodiment of the present application;
  • FIG. 1b is a hardware structural diagram of a computing device according to an embodiment of the present application;
  • FIG. 2a is a flowchart of a lane determination method according to an embodiment of the present application;
  • FIG. 2b is a schematic diagram of a lane determination method according to an embodiment of the present application;
  • FIG. 3 is a flowchart of a lane determination method according to an embodiment of the present application;
  • FIG. 4a is a flowchart of a lane determination method according to an embodiment of the present application;
  • FIG. 4b is a schematic diagram of a lane determination method according to an embodiment of the present application;
  • FIG. 4c is a flowchart of a lane determination method according to an embodiment of the present application;
  • FIG. 4d is a schematic diagram of a lane determination method according to an embodiment of the present application;
  • FIG. 5a is a flowchart of a lane determination method according to an embodiment of the present application;
  • FIG. 5b is a flowchart of a lane determination method according to an embodiment of the present application;
  • FIG. 6a is a schematic diagram of image processing according to an embodiment of the present application;
  • FIG. 6b is a schematic diagram of image processing according to an embodiment of the present application;
  • FIG. 6c is a schematic diagram of a lane according to an embodiment of the present application;
  • FIG. 7 is a schematic diagram of a lane determining apparatus according to an embodiment of the present application;
  • FIG. 8 is a schematic diagram of a target lane determining apparatus according to an embodiment of the present application;
  • FIG. 9 is a structural block diagram of a terminal according to an embodiment of the present application.
  • FIG. 1a is a schematic diagram of an implementation environment of an embodiment of the present application.
  • The server 102 can be connected via a network to an in-vehicle device (also referred to as a terminal device) in the car 104.
  • The above network may include, but is not limited to, a wide area network, a metropolitan area network, a local area network, and the like.
  • The terminal device 106 may include, but is not limited to, various in-vehicle terminals such as a PC, a mobile phone, a tablet computer, a driving recorder, a car navigation device, and the like.
  • The method of the embodiment of the present application may be performed by the server 102, by the terminal device 106 on the car 104, or jointly by the server 102 and the terminal device 106.
  • When the terminal device 106 performs the method of the embodiment of the present application, the method may also be performed by a client installed on it.
  • FIG. 1b is a structural diagram of a computing device according to an embodiment of the present application.
  • The computing device 100b may be the terminal device 106 in the implementation environment 100a shown in FIG. 1a, or may be the server 102.
  • The computing device 100b can include one or more of the following components: a processor 101, a memory 103, and a communication module 105.
  • The computing device 100b can use the communication module 105 to communicate with other devices over a network.
  • The processor 101 may include one or more processors, which may be single-core or multi-core processors, may be in the same physical device, or may be distributed among multiple physical devices.
  • The processor 101 can execute the computer readable instructions stored in the memory 103.
  • The memory 103 can include an operating system 107, a communication module 108, and a lane determination module 109.
  • The lane determination module 109 can include computer readable instructions. These computer readable instructions may cause the processor 101 to perform the lane determination method of various embodiments of the present application.
  • FIG. 2a is a flowchart of a method for determining a lane according to an embodiment of the present application. As shown in FIG. 2a, the method may include the following steps.
  • S201: Acquire image information, collected on the vehicle, of the road surface on which the vehicle travels.
  • An image capture device can be mounted on the vehicle so that it acquires image information of the road surface on which the vehicle travels.
  • The image information may include pictures and videos.
  • The image capture device can include a camera, a thermal imager, and the like.
  • The image capture device can be mounted at the front or the rear of the vehicle.
  • For example, the image capture device can be mounted in the middle of the front windshield or the rear windshield, so that the camera is as parallel as possible to the road surface and its optical axis points directly ahead of or behind the vehicle along the road.
  • In this way, the vanishing point of the road is near the center of the video image, and the lane lines on both sides of the vehicle can fall within the video image.
  • S202: Identify, from the image information, first lane information of at least one first lane of the road surface.
  • The first lane information includes a positional relationship between the driving lane in which the vehicle is located on the road surface and the at least one first lane, and a feature of the at least one first lane.
  • The driving lane of the vehicle may be determined in the image information, and the positional relationship may be determined based on the determined driving lane and the features of the first lanes.
  • The driving lane of the vehicle refers to the lane in which the vehicle is located on the road surface, and a first lane refers to a lane recognized from the image information.
  • The positional relationship is the positional relationship between the driving lane and the first lanes.
  • For example, the positional relationship may be: the driving lane is on the left or right side of a first lane, or the driving lane is located outside the first lanes on their left side.
  • As another example, the positional relationship may be: the driving lane is the second lane from the left among the first lanes.
  • The positional relationship between the driving lane and the first lanes may also be expressed as a positional relationship between the driving lane and the lane lines of the first lanes.
  • For example, the positional relationship may be that the driving lane is located between the 2nd and the 3rd lane lines from the left among the six lane lines of the first lanes.
  • The lane in which the center line of the image lies may be determined as the driving lane.
  • For example, the center line 21 of the image 20 is drawn along the direction in which the lanes extend, and the lane L3 in which the center line lies is determined as the driving lane.
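The center-line heuristic above can be sketched in a few lines of Python. This is a hypothetical illustration, not code from the patent: each lane line is reduced to its x-coordinate (in pixels) on the bottom row of the image, and the driving lane is the lane that contains the image's vertical center line.

```python
def lane_at_center(line_x_positions, image_width):
    """Return the 0-based index of the lane containing the image center line."""
    center_x = image_width / 2
    xs = sorted(line_x_positions)
    for i in range(len(xs) - 1):
        if xs[i] <= center_x < xs[i + 1]:
            return i  # lane i lies between lines i and i+1
    return None  # the center falls outside the detected lanes


# Five lane lines bound four lanes; the center of a 1280-px image (x = 640)
# falls between the 2nd and 3rd lines, i.e. lane index 1.
print(lane_at_center([100, 420, 760, 1050, 1240], 1280))  # -> 1
```

The line x-positions here are illustrative values; in practice they would come from a lane-line detector run on the captured image.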
  • Alternatively, the lane having the largest lane width in the image information may be determined as the driving lane.
  • For example, the driving lane of the vehicle may be determined according to the widths W1, W2, W3, W4, and W5, near the vehicle, of the lanes L1, L2, L3, L4, and L5. The width W3 of the lane L3 is the largest, so the lane L3 can be determined as the driving lane.
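The widest-lane heuristic reduces to an argmax over the apparent lane widths. A minimal sketch with illustrative pixel widths (the values are assumptions, not from the patent):

```python
def widest_lane(widths):
    """Return the index of the lane with the largest apparent width."""
    return max(range(len(widths)), key=lambda i: widths[i])


# Illustrative apparent widths of lanes L1..L5 near the vehicle; the middle
# lane appears widest in the image because the camera sits inside it.
print(widest_lane([80, 150, 310, 160, 90]))  # -> 2, i.e. lane L3
```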
  • The driving lane may also be determined based on the shape of a lane in the image information.
  • For example, two parallel auxiliary lines 22 and 23 may be drawn in the image 20 in the direction perpendicular to the direction in which the lanes extend, to obtain a trapezoid corresponding to each lane. Among these trapezoids, the trapezoid T3 whose two corners on one of the auxiliary lines are both acute angles (such as the angles A1 and A2 in FIG. 2b) corresponds to the lane L3, which serves as the driving lane.
  • As another example, two parallel lines 22 and 23 perpendicular to the direction in which the lanes extend may be drawn, and the lane, bounded by two adjacent lane lines, that forms the trapezoid with the largest area between the two parallel lines may be determined as the driving lane.
  • In FIG. 2b, the area of the trapezoid T3 is the largest, so it can be determined that the lane L3 is the driving lane.
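The largest-trapezoid heuristic can be sketched as follows. This is a hypothetical illustration: each lane line is reduced to its x-coordinates at the two horizontal auxiliary lines, and the lane whose trapezoid has the largest area is picked; all pixel values are invented for the example.

```python
def trapezoid_area(left_line, right_line, y_top, y_bottom):
    """Area of the trapezoid a lane forms between two horizontal auxiliary lines.

    Each lane line is given as (x at the upper auxiliary line, x at the lower one).
    """
    top_w = right_line[0] - left_line[0]
    bot_w = right_line[1] - left_line[1]
    return 0.5 * (top_w + bot_w) * (y_bottom - y_top)


# Illustrative lane lines: (x at upper auxiliary line, x at lower auxiliary line).
# The lane the camera sits in splays outward toward the image bottom, so its
# trapezoid is the largest.
lines = [(590, 0), (615, 300), (640, 980), (665, 1200), (690, 1280)]
areas = [trapezoid_area(lines[i], lines[i + 1], 300, 700)
         for i in range(len(lines) - 1)]
print(areas.index(max(areas)))  # -> 1: the second lane has the largest trapezoid
```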
  • The driving lane may also be determined based on the angles of the lane lines in the image information. For example, a plurality of lane lines may be identified in the image, and the lane bounded by two adjacent lane lines having opposite tilt directions may be determined as the driving lane. As another example, the lane bounded by two adjacent lane lines having the largest included angle may be determined as the driving lane.
  • S203: Determine a target lane of the vehicle in the map by feature-matching the feature of the at least one first lane with a feature of at least one second lane in the map of the road-surface location.
  • The target lane is the lane in the map corresponding to the driving lane, and the feature of the at least one second lane is acquired from the map.
  • In the above method, the lane information extracted from the image acquired on the vehicle is feature-matched with the lane information of the same geographic location obtained from the map, thereby determining the target lane in the map corresponding to the driving lane of the vehicle and positioning the vehicle at the lane level.
  • The lane information of the vehicle is very useful in technologies such as vehicle navigation and automatic driving, and can help improve the accuracy of navigation and the safety of automatic driving.
  • FIG. 3 is a flowchart of a lane determination method according to an embodiment of the present application. As shown in FIG. 3, the method can include the following steps.
  • S301: Identify, from the image information of the road surface collected on the vehicle, a first lane line feature of first lane lines of the road surface, and a positional relationship between the driving lane and the first lane lines.
  • The first lane line feature may include features of one or more first lane lines.
  • S302: Acquire, from the map, a second lane line feature of the road at the geographical location of the vehicle.
  • The second lane line feature may include features of one or more second lane lines.
  • S303: Determine, by using the second lane line feature, the first lane line feature, and the foregoing positional relationship, the target lane corresponding to the driving lane in the map.
  • In the above method, the features of the lane lines identified from the image are feature-matched with the features of the lane lines of the corresponding road surface in the map, thereby determining the target lane, in the map, of the vehicle's driving lane; this can reduce the amount of computation required for lane determination.
  • FIG. 4a is a flowchart of a lane determination method according to an embodiment of the present application. As shown in FIG. 4a, the method can include the following steps.
  • S401: Determine a third lane line corresponding to the first lane line among the second lane lines by comparing the first lane line feature with the second lane line feature.
  • S402: Determine, according to the third lane line and the positional relationship between the driving lane and the first lane line, the target lane corresponding to the driving lane in the map.
  • Among the one or more second lane lines, a lane line whose line type is the same as that of the first lane line may be used as the third lane line; the target lane corresponding to the driving lane in the map is then determined according to the third lane line and the positional relationship.
  • The line type of a lane line may include a solid line, a dashed line, a double solid line, a straight line, a curve, and the like.
  • For example, the line type of the first lane line identified from the image information is a dashed line, and the positional relationship identified from the image information between the driving lane of the vehicle and the lane line is that the driving lane is located to the left of the lane line; meanwhile, the line types of the second lane lines obtained from the map are solid, dashed, and solid. In this case, the second lane line whose line type is a dashed line may be used as the lane line corresponding to the first lane line identified in the image, that is, the third lane line. According to the third lane line and the positional relationship, the lane to the left of the third lane line is determined to be the target lane in the map.
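When a single recognized line type is unique among the map's line types, the matching step collapses to an index lookup. A hypothetical sketch mirroring the example above (the lane-numbering convention and all values are illustrative assumptions):

```python
# Map line types, left to right (illustrative, mirroring the example above).
map_line_types = ["solid", "dashed", "solid"]

# The recognized first lane line is dashed; a dashed line is unique in the
# map, so that map line is the third lane line.
third_line_index = map_line_types.index("dashed")

# Number the lanes 0..n-2, where lane i lies between lines i and i+1.
# "The driving lane is to the left of the third lane line" means that line is
# the lane's right boundary, i.e. the lane index is third_line_index - 1.
target_lane = third_line_index - 1
print(target_lane)  # -> 0, the left of the two lanes
```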
  • When there are a plurality of first lane lines, a plurality of lane lines, among the second lane lines, whose line types and arrangement are the same as those of the plurality of first lane lines may be used as the third lane lines; the target lane corresponding to the driving lane in the map is then determined according to the determined third lane lines and the positional relationship.
  • For example, a plurality of first lane lines are identified from the image information acquired by the camera, and the line types and arrangement in the first lane line feature are solid, dashed, dashed, and dashed. It is also identified from the image information that the positional relationship between the vehicle and the four first lane lines is that the vehicle is located in the middle of the four first lane lines, that is, the lane lines corresponding to the first two line types are the two lane lines on the left side of the vehicle, and the lane lines corresponding to the last two line types are the two lane lines on the right side of the vehicle. Meanwhile, the map shows that the road has 5 lanes, that is, 6 second lane lines, and the second lane line feature is: solid line, dashed line, dashed line, dashed line, dashed line, solid line. Four second lane lines whose line types and arrangement match the first lane line feature (solid, dashed, dashed, dashed) may then be used as the third lane lines, that is, the 1st to the 4th second lane lines. From the determined third lane lines and the positional relationship, it can be determined that the vehicle is located in the middle of the third lane lines, that is, to the right of the 2nd lane line and to the left of the 3rd lane line, so it can be determined that the target lane is the second lane from the left in the map.
  • Symbols can be used to indicate the line types of lane lines, and a symbol string can be used to indicate the arrangement of a plurality of lane lines. For example, according to a preset correspondence between line types and symbols, the symbols corresponding to the line types of the first lane lines are organized according to the arrangement of the first lane lines, generating a first symbol string representing the first lane line feature.
  • Similarly, the symbols corresponding to the line types of the second lane lines are organized according to the arrangement of the second lane lines, generating a second symbol string representing the second lane line feature. The first symbol string is compared with the second symbol string, and the plurality of lane lines corresponding to the symbols, in the second symbol string, that are identical to the first symbol string are used as the third lane lines; the target lane corresponding to the driving lane in the map is then determined according to the third lane lines and the positional relationship. For example, as shown in FIG. 4b,
  • the line types and arrangement of the four first lane lines J1, J2, J3, and J4 identified from the image information are {J1 solid line, J2 dashed line, J3 dashed line, J4 dashed line}. If a dashed line is indicated by 0 and a solid line by 1, the first symbol string representing the first lane line feature is obtained as "1000". Six second lane lines K1, K2, K3, K4, K5, and K6 are obtained from the map, whose line types and arrangement are {K1 solid line, K2 dashed line, K3 dashed line, K4 dashed line, K5 dashed line, K6 solid line}, so the second symbol string representing the second lane line feature is obtained as "100001". The four lane lines K1, K2, K3, and K4 corresponding to the first four symbols of the second symbol string are the third lane lines corresponding to the first lane lines J1, J2, J3, and J4 in the map. Since the driving lane is between the 2nd and the 3rd lane lines from the left, the lane between the 2nd and the 3rd third lane lines, that is, the lane between K2 and K3, can be determined as the target lane.
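The symbol-string matching in this example amounts to a substring search. A minimal Python sketch; the encoding '1' = solid, '0' = dashed follows the example, while the helper function and its name are illustrative assumptions:

```python
def match_lane(first_str, second_str, lane_offset):
    """Locate the first symbol string inside the second and return the
    0-based index (from the left) of the target lane in the map.

    lane_offset is the driving lane's position among the recognized lane
    lines, e.g. 1 if the vehicle drives between the 2nd and 3rd lines.
    """
    pos = second_str.find(first_str)
    if pos < 0:
        return None  # no match: the recognized lines are not found in the map
    return pos + lane_offset


first = "1000"      # J1..J4 recognized from the image: solid, dashed, dashed, dashed
second = "100001"   # K1..K6 from the map: solid, dashed x4, solid
# The driving lane lies between the 2nd and 3rd recognized lines (offset 1).
print(match_lane(first, second, 1))  # -> 1: second lane from the left (K2-K3)
```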
  • FIG. 4c is a flowchart of a lane determination method according to an embodiment of the present application. As shown in Figure 4c, the method can include the following steps.
  • S411: Determine a lane description of the driving lane according to the first lane line feature and the positional relationship between the driving lane and the first lane lines.
  • S412: Determine a lane description of each of the at least one second lane according to the second lane line feature.
  • The lane description of a lane may include features of one or more lane lines and the positional relationship of the one or more lane lines to the lane.
  • When the lane description of the driving lane includes the line type of one first lane line and the positional relationship between that first lane line and the driving lane, the lane description of each second lane may include the line type of one second lane line and the positional relationship between that second lane line and the second lane.
  • The second lane whose lane description has a line type and a positional relationship consistent with those in the lane description of the driving lane may be determined as the target lane.
  • For example, the line type of the first lane line identified from the image information is a dashed line, and the driving lane is located to the left of the first lane line, so the lane description of the driving lane can be determined, for example, as "virtual (dashed) line on the right". If three second lane lines are obtained from the map, and the line types and arrangement of the second lane lines are solid, dashed, and solid, then for the two second lanes A and B bounded by the three second lane lines, the lane description of lane A is "real (solid) line on the left, virtual line on the right", and the lane description of lane B is "virtual line on the left, real line on the right". The description of lane A is consistent with that of the driving lane, so lane A can be determined as the target lane.
  • When the lane description of the driving lane includes the line types of a plurality of first lane lines, a first arrangement, and the positional relationship between the plurality of first lane lines and the driving lane, the lane description of each second lane is determined accordingly.
  • In this case, the lane description of each second lane includes the line types of a plurality of second lane lines, a second arrangement, and the positional relationship between the plurality of second lane lines and the second lane.
  • For example, the line types and arrangement of the four first lane lines are {solid line, dashed line, dashed line, dashed line}, and the driving lane is located between the middle two of the four first lane lines; the lane description of the driving lane can therefore be determined, for example, as "solid line, dashed line, driving lane, dashed line, dashed line".
  • Meanwhile, 6 second lane lines, and the 5 second lanes they bound, are obtained from the map; the line types and arrangement of the 6 second lane lines are {solid line, dashed line, dashed line, dashed line, dashed line, solid line}.
  • The lane descriptions of the five second lanes can then be determined, namely lane A {solid line, lane A, dashed line, dashed line, dashed line, dashed line, solid line}, lane B {solid line, dashed line, lane B, dashed line, dashed line, dashed line, solid line}, lane C {solid line, dashed line, dashed line, lane C, dashed line, dashed line, solid line}, lane D {solid line, dashed line, dashed line, dashed line, lane D, dashed line, solid line}, and lane E {solid line, dashed line, dashed line, dashed line, dashed line, lane E, solid line}.
  • The lane description of lane B is consistent with the lane description of the driving lane, so it can be determined that lane B is the target lane corresponding to the driving lane in the map.
  • Alternatively, according to a preset correspondence between line types and positional relationships on the one hand and symbols on the other, the symbols corresponding to the first lane lines in the lane description of the driving lane may be organized according to the first arrangement, generating a first symbol string representing the lane description of the driving lane; the symbols corresponding to the plurality of second lane lines in the lane description of each second lane are organized according to the second arrangement according to the same correspondence, generating a second symbol string representing the lane description of each second lane; the first symbol string is compared with the second symbol string of each second lane, and the second lane whose second symbol string matches the first symbol string is determined as the target lane. For example, as shown in FIG. 4d,
• Suppose the line types and arrangement of the four first lane lines J1, J2, J3, and J4 identified from the image information are {J1 solid line, J2 dashed line, J3 dashed line, J4 dashed line}, and the positional relationship between the driving lane L0 and the first lane lines is that the driving lane lies between lane lines J2 and J3.
• If a dashed line is denoted by 0, a solid line by 1, and a lane by Q, the first symbol string representing the lane description of the driving lane is obtained as "10Q00".
• Similarly, suppose six second lane lines K1, K2, K3, K4, K5, and K6 are obtained from the map, with line types and arrangement {K1 solid line, K2 dashed line, K3 dashed line, K4 dashed line, K5 dashed line, K6 solid line}. The second symbol strings for the lane descriptions of the five second lanes are then L1: "1Q00001", L2: "10Q0001", L3: "100Q001", L4: "1000Q01", L5: "10000Q1". Comparing the first symbol string with each second symbol string, the symbol string "10Q0001" of lane L2 agrees with the first symbol string "10Q00" of the driving lane L0 (the detected string covers only the four lane lines nearest the vehicle), so L2 can be determined to be the target lane corresponding to the driving lane L0 in the map.
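The symbol-string encoding and matching described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the patent: the function names are invented, and the prefix/suffix window match is one plausible way to compare a detected string that covers only the four nearest lane lines against the full map strings.

```python
def lane_strings(line_types):
    """Given the dashed/solid types of all lane lines on the road (leftmost
    first, 'solid' -> '1', 'dashed' -> '0'), build one symbol string per
    lane by inserting 'Q' at that lane's position, e.g. lane L2 -> '10Q0001'."""
    codes = ['1' if t == 'solid' else '0' for t in line_types]
    return [''.join(codes[:i + 1]) + 'Q' + ''.join(codes[i + 1:])
            for i in range(len(codes) - 1)]

def match_lane(detected, candidates):
    """Return the indices of candidate strings whose window around 'Q'
    agrees with the detected string (which may cover fewer lane lines)."""
    d = detected.index('Q')
    left = d                           # lane lines detected left of the vehicle
    right = len(detected) - d - 1      # lane lines detected right of the vehicle
    hits = []
    for i, cand in enumerate(candidates):
        c = cand.index('Q')
        if c - left >= 0 and c + right + 1 <= len(cand) \
           and cand[c - left:c + right + 1] == detected:
            hits.append(i)
    return hits
```

With the six-line road of the example, `lane_strings(['solid', 'dashed', 'dashed', 'dashed', 'dashed', 'solid'])` yields the five strings L1 to L5 above, and `match_lane('10Q00', ...)` singles out L2.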
• In some embodiments, the second lanes or second lane lines acquired from the map may be limited, based on the direction of travel of the vehicle, to lanes or lane lines consistent with that direction of travel.
• FIG. 5a is a flowchart of an optional lane determining method according to an embodiment of the present application. As shown in FIG. 5a, the method may include the following steps:
• Step S501: performing image acquisition on the driving road surface of the vehicle to obtain image information of the driving road surface;
• Step S502: identifying first lane information of the driving road surface from the image information, where the first lane information includes a first lane for indicating the lane position of the vehicle on the driving road surface and a feature of the first lane;
• Step S503: determining the target lane of the vehicle in the map by feature matching between the feature of the first lane and the feature of at least one second lane in a map of the location of the driving road surface, where the at least one second lane and the feature of the at least one second lane are obtained from the map, and the at least one second lane includes the target lane.
• In this way, the first lane information is identified by performing image processing on the image information of the driving road surface, and the feature of the first lane is then matched against the feature of at least one second lane in the map of the location of the driving road surface to determine the target lane of the vehicle in the map. Lateral positioning of the vehicle can thus be realized, and the actual lane determined, using only the existing equipment of current autonomous vehicles, which solves the technical problem of the high input cost of precise vehicle positioning and achieves the technical effect of accurately positioning the vehicle at reduced cost.
• The above driving road surface is the road surface on which the vehicle is currently traveling; the above image information is an image capturing various features of the driving road surface, such as a color or black-and-white picture, a color or black-and-white video, or a thermal imaging picture; the above features include lane line features, lane width, lane type, road signs, traffic signs, speed limit signs, and the like; lane line features include at least dashed lines, solid lines, straight lines, and curves; lane types can include highways, national highways, and the like.
• The above vehicles may be autonomous vehicles, unmanned vehicles, various types of automobiles, and the like.
• The first lane indicated by the first lane information may be the lane in which the vehicle actually travels, or any lane having a relative positional relationship with the lane in which the vehicle actually travels.
• The above method of the embodiments of the present application is mainly used for lateral positioning of a vehicle, including but not limited to lateral positioning of an autonomous vehicle.
• During driving, the vehicle may straddle a line (i.e., a lane line). For convenience of description, in the embodiments of the present application, when the vehicle straddles a line, the vehicle is considered to be in a lane as long as at least 50% of the vehicle's width is within that lane.
• Driverless cars have been one of the hottest projects in the automotive industry in various countries. Because driverless cars can operate in dangerous working environments and have broad military application prospects, they also attract government investment.
• Current self-driving vehicles are mostly limited to closed places or specific purposes. With the development of technology, in recent years many car companies have begun to demonstrate autonomous driving technology, and some car manufacturers even predict that this technology will be popularized by 2020.
• The technical solution of the embodiments of the present application focuses on solving the lateral positioning of the driverless car, so the description related to GPS/BDS applications is kept brief; when the positioning accuracy of GPS/BDS is not high, the solution serves as an aid to GPS/BDS positioning.
• The lateral positioning of the vehicle is more fundamental and important than the longitudinal positioning, especially while the vehicle is driving. This is because the driverless car travels along a path planned in advance on a high-precision map, and the path must be accurate at least to the lane level. As long as the vehicle does not encounter unexpected situations or intersections, it can continue along the path. Therefore, in the longitudinal direction, the vehicle being slightly ahead of or behind its estimated position has little effect on the driverless car; in the lateral direction, however, accurate position information must be provided to the vehicle. Only when the vehicle knows the specific lane it currently occupies can it make decisions and plan the next driving action. With the technical solution of the embodiments of the present application, accurate lateral positioning of the vehicle can be realized. The embodiments of the present application are described in detail below with reference to FIG. 2:
• In step S501, when image acquisition of the driving road surface of the vehicle is performed to obtain image information of the driving road surface, a vehicle camera may be used to collect pictures or videos of the driving road surface, or a vehicle thermal imager may be used to collect a thermal imaging map of the driving road surface.
• The camera that detects the lane lines can be installed at the front of the vehicle, generally in the middle of the front windshield, so that the camera is as parallel as possible to the road surface with its optical axis pointing toward the road ahead; in this way, the vanishing point of the road is near the center of the video image, and the lane lines on both sides of the vehicle fall within the video image.
• The equivalent focal length of the selected camera should not be too large, to ensure a sufficient viewing angle to capture the lane lines on both sides of the vehicle.
• Once the acquisition device is installed, its collection area is determined, i.e., the lane position of the vehicle within the collection area is relatively fixed; for example, for the above installation, the lane at the middle of the acquisition width is the lane in which the vehicle is located.
• In some embodiments, when the first lane information of the driving road surface is recognized from the image information, the image information may first be subjected to orthographic projection processing; features may then be extracted from the orthographically projected image information; and the image information after feature extraction may be subjected to perspective back-projection processing.
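The orthographic (bird's-eye) projection amounts to applying a plane homography to image points. Below is a minimal sketch assuming a 3x3 homography `H` has already been calibrated from the camera's internal and external parameters; the matrix used here is a toy example, not a real calibration, and the function name is illustrative.

```python
def to_ground_plane(H, x, y):
    """Map an image pixel (x, y) to road-plane coordinates using the
    3x3 homography H, given as row-major nested lists."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]            # projective scale
    u = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
    v = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
    return u, v

# Toy homography: points with larger y are scaled down more strongly,
# mimicking how perspective foreshortening is undone for the road plane.
H = [[1.0, 0.0,  0.0],
     [0.0, 1.0,  0.0],
     [0.0, 0.01, 1.0]]
```

In practice `H` would come from camera calibration (for example, OpenCV's `getPerspectiveTransform` on four known road points), and the perspective back-projection of the third step applies the inverse homography.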
• In the perspective image, lane lines are difficult to distinguish from linear interference; this is especially true of dashed lane lines, which often appear as short line segments in the image and are therefore hard to separate completely from some linear interference.
• In the orthographic projection, however, a dashed lane line maps to line segments parallel to the other lane lines, as shown in FIG. 5, while linear interference is not parallel to the other lane lines or is abnormally spaced relative to them, which makes it easy to remove.
• With the recognition scheme of this embodiment, each lane in the image and the lane lines of each lane (such as their dashed/solid type, line width, and the like) can be identified.
• One or more lanes (i.e., first lanes) in the image may be identified, and the target lane determined from them; preferably, with improved recognition accuracy, all first lanes in the image can be identified, so that by comparing the features of multiple lanes, the lane in which the vehicle is actually located can be determined more accurately.
• In some embodiments, before the target lane in the map is determined in step S503 by feature matching between the feature of the first lane and the feature of the at least one second lane in the map of the location of the driving road surface, the features of the at least one second lane may be obtained in the following manner: the satellite positioning information of the vehicle is obtained, i.e., acquired through an on-vehicle satellite positioning system such as GPS/BDS/GLONASS; limited by positioning accuracy and the environment, the satellite positioning information can only locate the vehicle with low accuracy, so the further precise positioning of the present application is needed; a map of the road section where the driving road surface is located is then obtained according to the satellite positioning information, the map carrying the at least one second lane and the features of the at least one second lane.
• The map used above is a high-precision map. The map of the road section where the driving road surface is located can be obtained by locating the specific road section in an on-vehicle map, or obtained online from a high-precision map on the Internet.
  • the above-mentioned lane features mainly include a lane line feature and a lane width.
• In some embodiments, the target lane may be determined as follows: a third lane whose lane line features and lane width match the lane line features and lane width of the first lane is found from the at least one second lane, the third lane being the lane corresponding to the first lane in the map; for example, a third lane whose lane lines are solid and whose lane width is 3 meters is found from the second lanes. The lane in the at least one second lane that has a relative positional relationship with the third lane is then determined to be the target lane of the vehicle in the map, where the relative positional relationship is the positional relationship between the driving lane of the vehicle on the driving road surface and the identified first lane.
• When only one first lane is identified, its features (e.g., solid lane lines) may be difficult to match to a unique result.
• In that case, when searching the at least one second lane for a third lane whose lane line features and lane width match those of the first lane, a plurality of fourth lanes whose number and features match a plurality of identified first lanes may be found among the plurality of second lanes.
• Here, a plurality of first lanes are identified in the picture, and these first lanes are adjacent lanes; for example, three consecutive lanes are identified while the road actually has five lanes, and the three lanes are matched against the five lanes.
• Each of the fourth lanes is a third lane whose lane line features and lane width match those of a first lane.
• The target lane is then determined among the fourth lanes according to the position of the driving lane among the plurality of first lanes. For example, three first lanes are identified, the driving lane being the middle one of the three adjacent first lanes; after three consecutive lanes (i.e., fourth lanes) matching the three first lanes are determined, the middle one of the three fourth lanes can be determined to be the target lane.
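This multi-lane window matching can be sketched as follows. The data layout (pairs of lane line types plus a width) and the 0.5 m width tolerance are illustrative assumptions, not values from the patent.

```python
def lanes_match(a, b, width_tol=0.5):
    """a, b: (lane_line_types, lane_width_m). Lanes match when the line
    types are identical and the widths differ by less than the tolerance."""
    return a[0] == b[0] and abs(a[1] - b[1]) < width_tol

def find_target_lane(first_lanes, second_lanes, driving_idx):
    """Slide the window of adjacent first lanes over the map's second
    lanes; wherever the whole window matches, the target lane is the map
    lane aligned with the driving lane's position in the window."""
    n = len(first_lanes)
    targets = []
    for s in range(len(second_lanes) - n + 1):
        if all(lanes_match(first_lanes[k], second_lanes[s + k]) for k in range(n)):
            targets.append(s + driving_idx)
    return targets

# Five map lanes of a road whose lines are solid|dashed|dashed|dashed|dashed|solid:
second = [(('solid', 'dashed'), 3.5),
          (('dashed', 'dashed'), 3.5),
          (('dashed', 'dashed'), 3.5),
          (('dashed', 'dashed'), 3.5),
          (('dashed', 'solid'), 3.5)]
# Three adjacent first lanes seen by the camera; the driving lane is the middle one.
first = second[:3]
```

Here the leftmost lane's solid boundary anchors the window uniquely, so `find_target_lane(first, second, 1)` resolves the driving lane to the second map lane even though a single middle lane alone would be ambiguous.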
• The target lane has a first confidence level for indicating the accuracy of the determination.
• In some embodiments, after the target lane of the vehicle in the map is determined by feature matching between the feature of the first lane and the feature of the at least one second lane in the map of the location of the driving road surface, a fifth lane in which a positioning sensor of the vehicle locates the vehicle may be obtained, the fifth lane having a second confidence level for indicating positioning accuracy; from the target lane and the fifth lane, the lane whose confidence satisfies a preset condition is selected as the actual lane of the vehicle in the map, where the preset condition is a screening condition for determining the actual lane of the vehicle in the map.
• That is, the positioning results of a plurality of sensors can be consulted, the positioning results of the respective sensors merged according to their confidence, and the specific position of the vehicle finally determined; for example, the lane with the higher confidence is taken as the final result.
• This lane can be used as the final recognition result, and the driving of the vehicle can be controlled accordingly, thereby improving driving safety.
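A toy sketch of merging lane estimates by confidence follows. The rule "highest confidence, provided it clears a threshold" is just one possible preset condition, assumed here for illustration; the patent leaves the screening condition open.

```python
def fuse_lane_estimates(estimates, threshold=0.6):
    """estimates: (lane_id, confidence) pairs, e.g. the camera-matched
    target lane and a sensor-located fifth lane. Returns the winning
    lane, or None when no estimate is trustworthy enough."""
    lane, conf = max(estimates, key=lambda e: e[1])
    return lane if conf >= threshold else None
```

When both sources agree, the fused result simply confirms the lane; when they disagree, the higher-confidence source wins, and a low-confidence tie can be rejected outright so the vehicle falls back to a conservative behavior.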
• In some embodiments, in addition to determining the driving lane of the vehicle by the above method, the lateral position of the vehicle may be located more precisely. After the first lane information of the driving road surface is identified from the image information, the distance between the vehicle and the lane lines of the first lane may be determined according to the position of the center line of the image information along the lane direction (that is, the relative position of the vehicle in the image) and the positions of the lane lines of the first lane.
• If the camera is installed at the middle of the vehicle, the middle of the collection width is the position of the vehicle, i.e., for any acquired image, the lateral center of the image is the position of the vehicle.
• The distance between the vehicle and a lane line can then be calculated from the center position, the lane line position, and the vehicle width, so as to control the movement of the vehicle in the lateral direction. If the camera is not installed at the center of the vehicle, the method can still be applied, since the offset between the camera and the center of the vehicle is known.
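The distance calculation can be sketched as below. All numbers in the example are hypothetical; the pixel-to-meter scale is derived from the known lane width, which assumes the image (or its bird's-eye projection) has roughly uniform scale across the lane.

```python
def lateral_distance(image_width_px, left_line_px, right_line_px,
                     lane_width_m, vehicle_width_m, camera_offset_px=0):
    """Estimate the gap from each side of the vehicle to the lane lines.
    The vehicle center is taken as the image center plus a known camera
    mounting offset; pixels convert to meters via the lane width."""
    m_per_px = lane_width_m / (right_line_px - left_line_px)
    center_px = image_width_px / 2 + camera_offset_px
    left_gap = (center_px - left_line_px) * m_per_px - vehicle_width_m / 2
    right_gap = (right_line_px - center_px) * m_per_px - vehicle_width_m / 2
    return left_gap, right_gap
```

For a 1280 px wide image with lane lines at 340 px and 940 px, a 3.6 m lane, and a 1.8 m wide vehicle centered in the lane, both gaps come out to 0.9 m; a shrinking gap on one side can then drive the lateral control or a lane-departure warning.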
• The technical solution of the embodiments of the present application can be applied to the positioning requirements of unmanned vehicles and can be integrated with other vehicle positioning solutions to improve the accuracy and stability of vehicle positioning as a whole; it can also be used for the lane keeping and lane change reminders of an ADAS (Advanced Driver Assistance System) to ensure safe driving of the vehicle.
• Ordinary GPS is responsible for coarse positioning, with an accuracy generally around 1 to 10 m. Such accuracy is not enough to meet the needs of unmanned vehicles, but it is enough to locate, in a high-precision map, the piece of road data around the vehicle to serve as a reference for vehicle positioning. Precise positioning then finds the best matching position of the camera-detected lane lines within that road data. The embodiment is described in detail below with reference to the steps shown in FIG. 5b. The specific positioning steps are as follows:
• Step S511: acquiring the coordinates of the current position of the vehicle by using GPS.
• Step S512: determining the road information in the high-precision map around the vehicle from the coordinates of the current position, including how many lane lines there are in total and the line type of, and width between, each of the lane lines.
• Step S513: determining the line type of each of the lane lines closest to the vehicle on its left and right sides as detected by the camera (up to four lane lines in total, if present).
• Specifically, the camera collects images of the driving road surface of the vehicle in real time to obtain image information, and the client running on the terminal identifies the collected image information and determines lane information such as the lane lines in the image, lane line features, and lane width.
• Step S514: comparing the detected lane line patterns in sequence with the lane line patterns in the previously obtained high-precision map, thereby determining the position of the lane in which the vehicle is currently located.
• Step S515: when the vehicle changes lanes, the distance between the vehicle and the left and right lane lines changes accordingly, and the recorded lane position of the vehicle is updated.
• Step S516: for the currently located lateral position of the vehicle, a confidence is assigned based on the position change relative to the previous comparison and the length of the detected lane lines.
• Specifically, the client running on the terminal device configured on the vehicle compares the current frame of the road-surface image information collected by the camera with the previous frame; if the position of the vehicle in the current frame has changed relative to the previous frame, the confidence of the lane in which the vehicle is currently located is lowered. Alternatively, confidence may be assigned according to the lane line types recognized by the client from the road-surface image information: a high confidence is given to a lane line recognized as a solid line, and a low confidence to a lane line recognized as a dashed line.
• Step S517: when the confidence value is greater than a certain threshold, the lane keeping strategy is executed and lane line matching is suspended.
• In some embodiments, image processing can be performed in the manner shown in FIG. 6a: video image capture, color-to-grayscale conversion, image smoothing and denoising, gamma correction, binarization, morphological repair, skeleton extraction, Hough filtering, orthographic projection, lane line correction, Hough lane line extraction, perspective back-projection, lane line fusion, Kalman filtering, missing lane line estimation, lane line dashed/solid judgment, and vehicle position estimation.
• The first step of lane line detection is to crop the video image (as shown in sub-figure b in FIG. 6b), retaining only the region of interest.
• The color image is then converted to a grayscale image, and the image is smoothed and denoised using a bilateral filtering method.
• Gamma correction is applied to the image before it is binarized.
• Morphological operations are used to repair holes and smooth boundaries, and the lane line center lines are then extracted by a skeleton extraction algorithm (as shown in sub-figure d in FIG. 6b). On this basis, the Hough transform results are used for local filtering to remove interference and glitches.
• This yields the perspective projection image of the lane lines (as shown in the corresponding sub-figure of FIG. 6b), which is converted into an orthographic projection image according to the internal and external parameters of the camera (as shown in sub-figure f in FIG. 6b); morphological smoothing and skeleton extraction then give an orthographic view of the lane lines.
• Lane line correction is performed first, mainly to handle curves; the Hough transform then extracts the lane lines (as shown in sub-figure g in FIG. 6b), and erroneously extracted lane lines are removed according to the distance constraints between lane lines (as shown in sub-figure h in FIG. 6b).
• The remaining lane lines are back-projected into the perspective projection image and merged with the lane lines in the previous perspective projection image to obtain the finally detected lane lines.
• Kalman filtering can be performed for each lane line, and the position of a lane line temporarily missing due to occlusion or the like can be estimated. Finally, based on the continuity of the lane lines across consecutive frame images and the lengths of the lane lines, whether each lane line is dashed or solid is determined.
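The per-lane-line Kalman filtering and missing-line estimation can be illustrated with a minimal one-dimensional constant-position filter; the process and measurement noise values are illustrative, not from the patent, and the first frame is assumed to contain a detection.

```python
def kalman_track(measurements, q=0.01, r=0.25):
    """Minimal 1-D Kalman filter tracking a lane line's lateral offset
    across frames; None marks frames where the line was occluded, in
    which case the prediction (the previous estimate) fills the gap."""
    x, p = measurements[0], 1.0       # state estimate and its variance
    track = [x]
    for z in measurements[1:]:
        p += q                        # predict: constant-position model
        if z is not None:             # update only when the line was detected
            k = p / (p + r)           # Kalman gain
            x += k * (z - x)
            p *= (1 - k)
        track.append(x)
    return track
```

A real tracker would carry more state (offset plus slope, or curve coefficients), but the principle is the same: occluded frames are bridged by the prediction, which is exactly the "missing lane line estimation" step above.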
• Suppose the lane line layout of the road is as shown in FIG. 6c, and the lane lines are coded with 0 and 1: a dashed lane line is marked 0 and a solid lane line is marked 1.
• The lanes of a road with 5 lanes are numbered 1 to 5.
• The lane lines detected by the camera are coded in the same way.
• For example, if the code of the four detected lane lines is 1000, where the first two digits represent the two lane lines on the left side of the vehicle and the last two digits represent the two lane lines on the right side, then the current lateral position of the vehicle is within the second lane.
• If the four detected lane lines are coded 0000, the current lateral position of the vehicle is within the third lane.
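The lane lookup from the 0/1 code can be sketched as follows. This simplified version only handles interior lanes, where two lane lines are visible on each side; edge lanes, which see fewer lines, would need the "if any" handling mentioned in step S513.

```python
def locate_lane(road_code, detected):
    """road_code: dashed(0)/solid(1) code of all lane lines on the road,
    leftmost first, e.g. '100001' for a 5-lane road bounded by solid lines.
    detected: code of the two lane lines on each side of the vehicle.
    Returns the 1-based lane numbers consistent with the detection."""
    n_lanes = len(road_code) - 1
    hits = []
    for lane in range(2, n_lanes):        # interior lanes only
        # Lane k is bounded by lines k and k+1 (1-based); the camera sees
        # lines k-1..k+2, i.e. road_code[k-2:k+2] in 0-based slicing.
        if road_code[lane - 2:lane + 2] == detected:
            hits.append(lane)
    return hits
```

On the 5-lane road coded `'100001'`, the detection `'1000'` resolves to lane 2 and `'0000'` to lane 3, matching the worked examples above; an ambiguous road (e.g. many identical dashed windows) would return several candidates, which is where the confidence handling of steps S515 and S516 comes in.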
• The positioning of the vehicle is one of the key technologies for achieving automatic driving: the vehicle needs to locate its relative position in order to accurately sense its surroundings.
• The embodiments of the present application relate to a lateral positioning method based on a high-precision map, where lateral positioning refers to positioning perpendicular to the traveling direction of the vehicle, for example, which lane the vehicle is traveling in and its specific position relative to the lane lines.
• Given the high-precision data of the current road's lane lines, including the line type of each lane line (dashed or solid) and the relative positions and widths between lane lines, the plurality of lane lines detected in real time by the camera installed at the front of the vehicle are matched against this known information to locate which lane of the road the vehicle is traveling in, as well as its specific position relative to the lane lines.
• In this way, the first lane information is identified by performing image processing on the image information of the driving road surface, and the feature of the first lane is then matched against the feature of the at least one second lane in the map of the location of the driving road surface to determine the target lane of the vehicle in the map. Lateral positioning of the vehicle can thus be realized, and the actual lane determined, using only the existing equipment of current autonomous vehicles, which solves the technical problem of the high input cost of precise vehicle positioning and achieves the technical effect of accurately positioning the vehicle at reduced cost.
• Through the description of the above embodiments, a person skilled in the art can clearly understand that the method according to the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware, though in many cases the former is the better implementation.
• Based on such an understanding, the technical solution of the embodiments of the present application may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc), including a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
  • the present application also provides a determination device for a lane for implementing the above-described determination method of a lane.
• FIG. 7 is a schematic diagram of a lane determining apparatus according to an embodiment of the present application.
• As shown in FIG. 7, the apparatus may include one or more memories and one or more processors, where the one or more memories store one or more instruction modules configured to be executed by the one or more processors; the one or more instruction modules include an acquisition unit 72, an identification unit 74, and a first determination unit 76.
  • the collecting unit 72 is configured to perform image acquisition on the traveling road surface of the vehicle to obtain image information of the traveling road surface;
• the identification unit 74 is configured to identify first lane information of the driving road surface from the image information, where the first lane information includes a first lane for indicating the lane position of the vehicle on the driving road surface and a feature of the first lane;
• the first determining unit 76 is configured to determine the target lane of the vehicle in the map by feature matching between the feature of the first lane and the feature of at least one second lane in the map of the location of the driving road surface, where the at least one second lane and the feature of the at least one second lane are acquired from the map, and the at least one second lane includes the target lane.
• The collecting unit 72 in this embodiment may be used to perform step S501 in the above method embodiment of the present application, the identification unit 74 may be used to perform step S502, and the first determining unit 76 may be used to perform step S503.
• It should be noted here that the examples and application scenarios implemented by the above modules and their corresponding steps are the same, but are not limited to the contents disclosed in the above embodiment 1. The above modules may run, as part of the apparatus, in a hardware environment as shown in FIG. 1, and may be implemented by software or by hardware.
• In this way, the first lane information is identified by performing image processing on the image information of the driving road surface, and the feature of the first lane is then matched against the feature of at least one second lane in the map of the location of the driving road surface to determine the target lane of the vehicle in the map. Lateral positioning of the vehicle can thus be realized, and the actual lane determined, using only the existing equipment of current autonomous vehicles, which solves the technical problem of the high input cost of precise vehicle positioning and achieves the technical effect of accurately positioning the vehicle at reduced cost.
• The above driving road surface is the road surface on which the vehicle is currently traveling; the above image information is an image capturing various features of the driving road surface, such as a color or black-and-white picture, a color or black-and-white video, or a thermal imaging picture; the above features include lane line features, lane width, and the like; lane line features include at least dashed lines and solid lines; the above vehicles may be autonomous vehicles, various types of automobiles, and the like.
  • the first lane indicated by the first lane information may be a lane in which the vehicle actually travels, or may be any lane having a relative positional relationship with a lane in which the vehicle actually travels.
• The above apparatus of the embodiments of the present application is mainly used for lateral positioning of a vehicle, including but not limited to lateral positioning of an autonomous vehicle.
• During driving, the vehicle may straddle a line (i.e., a lane line). For convenience of description, in the embodiments of the present application, when the vehicle straddles a line, the vehicle is considered to be in a lane as long as at least 50% of the vehicle's width is within that lane.
• In some embodiments, the identification unit includes: a first processing module configured to perform orthographic projection processing on the image information; an extraction module configured to perform feature extraction on the image information after orthographic projection processing; and a second processing module configured to perform perspective back-projection processing on the image information after feature extraction.
• With the recognition scheme of this embodiment, each lane in the image and the lane lines of each lane (such as their dashed/solid type, line width, and the like) can be identified.
• In some embodiments, the apparatus of the embodiment of the present application may further include: a second obtaining unit 82, configured to obtain satellite positioning information of the vehicle before the target lane in the map is determined by feature matching between the feature of the first lane and the feature of the at least one second lane in the map of the location of the driving road surface; and a third obtaining unit 84, configured to acquire a map of the road section where the driving road surface is located according to the satellite positioning information, where the map carries the at least one second lane and the features of the at least one second lane.
• The map used above is a high-precision map. The map of the road section where the driving road surface is located can be obtained by locating the specific road section in an on-vehicle map, or obtained online from a high-precision map on the Internet.
• The lane features described above mainly include lane line features and lane width. The first determination unit, which determines the target lane by feature matching between the features of the first lane and the features of the at least one second lane in the map of the location of the driving road surface, may be implemented by the following modules: a search module, configured to search the at least one second lane for a third lane whose lane line features and lane width match the lane line features and lane width of the first lane, where the third lane is the lane corresponding to the first lane in the map; and a determining module, configured to determine that the lane in the at least one second lane having a relative positional relationship with the third lane is the target lane of the vehicle in the map, where the relative positional relationship is the positional relationship between the driving lane of the vehicle on the driving road surface and the first lane.
  • In a case where there are multiple first lanes and multiple second lanes, and the multiple first lanes include the driving lane of the vehicle, when the search module searches the at least one second lane for the third lane whose lane line feature and lane width match those of the first lane, it finds, among the multiple second lanes, multiple fourth lanes whose number and features match the multiple first lanes.
  • The determining module then determines the target lane among the multiple fourth lanes according to the position of the driving lane of the vehicle among the multiple first lanes. For example, if three first lanes are identified and the driving lane is the middle one of the three adjacent first lanes, then after the three consecutive lanes (i.e., the fourth lanes) matching the three first lanes are determined, the middle one of the three fourth lanes can be determined to be the target lane.
  • That is, multiple first lanes are identified in the picture. These first lanes are generally adjacent lanes, for example three consecutive lanes, while the road is actually five lanes wide; the three lanes are then matched against the five lanes.
  • The foregoing search module may include: a first determining sub-module, configured to determine whether the lane line feature of any one of the at least one second lane is the same as the lane line feature of the first lane; and a second determining sub-module, configured to determine whether the difference between the lane width of that lane and the lane width of the first lane is less than a preset value. When the lane line feature of a lane is the same as the lane line feature of the first lane, and the difference between their lane widths is less than the preset value, that lane is determined to be the third lane whose lane line feature and lane width match the lane line feature and lane width of the first lane.
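The matching performed by the two sub-modules, together with the sliding search over consecutive map lanes described above, can be sketched as follows. This is a minimal illustration, not the patent's implementation: lanes are represented as (line type, width) pairs, and the tolerance value and the names `lane_matches` and `find_fourth_lanes` are assumptions.

```python
# Minimal sketch of the matching described above: a detected lane matches a
# map lane when their lane line types are identical and their widths differ
# by less than a preset value. A run of detected (first) lanes is slid over
# the map (second) lanes to find the matching consecutive (fourth) lanes.
# All names and numbers here are illustrative, not from the patent.

WIDTH_TOLERANCE_M = 0.3  # preset value for the width difference (assumed)

def lane_matches(first, second, tol=WIDTH_TOLERANCE_M):
    """first/second are (line_type, width_m) tuples, e.g. ('dashed', 3.5)."""
    same_type = first[0] == second[0]
    close_width = abs(first[1] - second[1]) < tol
    return same_type and close_width

def find_fourth_lanes(first_lanes, second_lanes):
    """Return the start index of the consecutive map lanes matching the
    detected lanes, or None if no window matches."""
    n = len(first_lanes)
    for start in range(len(second_lanes) - n + 1):
        window = second_lanes[start:start + n]
        if all(lane_matches(f, s) for f, s in zip(first_lanes, window)):
            return start
    return None

# Three adjacent detected lanes matched against a five-lane road:
detected = [('solid', 3.5), ('dashed', 3.5), ('dashed', 3.4)]
road = [('solid', 3.5), ('dashed', 3.5), ('dashed', 3.4),
        ('dashed', 3.5), ('solid', 3.5)]
start = find_fourth_lanes(detected, road)
# If the driving lane is the middle of the three detected lanes,
# the target lane is the middle of the matched window:
target_index = start + 1 if start is not None else None
```

As in the description, the window can equally be slid from the rightmost lane line; only the iteration order changes.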
  • The target lane has a first confidence level indicating the accuracy of the determination. The apparatus may further include: a first acquiring unit, configured to acquire, after the target lane of the vehicle in the map is determined by the feature matching, a fifth lane in which a positioning sensor locates the vehicle in the map, where the fifth lane has a second confidence level indicating the positioning accuracy; and a selection unit, configured to select, from the target lane and the fifth lane, the lane whose confidence level satisfies a preset condition as the actual lane of the vehicle in the map.
  • In this way, the positioning results of multiple sensors can be consulted, the result with the higher confidence taken as the final result, and the driving of the vehicle controlled accordingly, which improves driving safety.
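The confidence-based selection can be sketched as below. This is a minimal illustration assuming each positioning result carries a numeric confidence, with the simplest preset condition of "highest confidence wins"; the name `select_actual_lane` and the values are assumptions.

```python
# Sketch of selecting the actual lane from several positioning results by
# confidence. Here the "preset condition" is simply taking the result with
# the highest confidence; a real system might also require a minimum
# threshold or agreement between several sensors. Names are illustrative.

def select_actual_lane(results):
    """results: list of (lane_id, confidence) from different sources,
    e.g. the camera/map match (target lane) and a positioning sensor
    (fifth lane). Returns the lane whose confidence is highest."""
    lane, confidence = max(results, key=lambda r: r[1])
    return lane

target_lane = ('lane_2', 0.9)   # first confidence: camera + map matching
fifth_lane = ('lane_3', 0.6)    # second confidence: positioning sensor
actual = select_actual_lane([target_lane, fifth_lane])
```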
  • In addition to determining the driving lane as above, the lateral position of the vehicle can be located more precisely, which may be implemented by a second determining unit of the apparatus. After the first lane information of the driving road surface is identified from the image information, the second determining unit determines the distance between the vehicle and a lane line of the first lane according to the position of the center line of the image information along the lane direction and the position of that lane line.
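A minimal sketch of the distance computation performed by the second determining unit, assuming (as elsewhere in this document) that the camera is mounted at the vehicle's lateral center so the image center line corresponds to the vehicle's position, and assuming a known ground resolution of the orthographically projected image; the scale, vehicle width, and function name are all illustrative assumptions.

```python
# Sketch: with the camera at the vehicle's lateral center, the vertical
# center line of the (orthographically projected) image is the vehicle's
# position; the gap to a lane line is the pixel offset times the ground
# resolution, minus half the vehicle width for the clearance to the body.
# All numbers below are assumed for illustration.

METERS_PER_PIXEL = 0.02  # ground resolution of the ortho image (assumed)
VEHICLE_WIDTH_M = 1.8    # vehicle width (assumed)

def distance_to_lane_line(image_width_px, lane_line_x_px,
                          m_per_px=METERS_PER_PIXEL,
                          vehicle_width_m=VEHICLE_WIDTH_M):
    """Distance from the vehicle's side to the lane line at pixel column
    lane_line_x_px, in metres (negative means the line is under the body)."""
    center_x = image_width_px / 2.0          # vehicle position in the image
    offset_m = abs(lane_line_x_px - center_x) * m_per_px
    return offset_m - vehicle_width_m / 2.0

# Lane line 160 px left of center in a 1280 px wide ortho image:
gap = distance_to_lane_line(1280, 480)
```

If the camera is not centered, the fixed camera-to-center offset can simply be added to `offset_m`, as the description notes.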
  • Vehicle positioning is one of the key technologies for automatic driving: the vehicle needs to locate its relative position in order to sense its surroundings accurately.
  • The embodiment of the present application relates to a lateral positioning method based on a high-precision map. Lateral positioning refers to positioning perpendicular to the traveling direction of the vehicle, for example, which lane the vehicle is traveling in and its specific position relative to the lane lines.
  • Given high-precision data of the lane lines of the current road, including the line type of each lane line (dashed or solid) and the relative positions and widths between lane lines, the multiple lane lines detected and identified in real time by a camera mounted at the front of the vehicle are matched against the known information, thereby locating which lane of the road the vehicle is traveling in, as well as its specific position relative to the lane lines.
  • The examples and application scenarios implemented by the above modules are the same as those of the corresponding steps, but are not limited to the contents disclosed in Embodiment 1 above. It should be noted that the foregoing modules may run, as part of the apparatus, in a hardware environment as shown in FIG. 1, and may be implemented by software or by hardware, where the hardware environment includes a network environment.
  • The present application also provides a server or terminal for implementing the above lane determination method.
  • As shown in FIG. 9, the terminal may include: one or more processors 901 (only one is shown in the figure), a memory 903, and a transmission device 905 (such as the transmitting apparatus in the above embodiment); the terminal may further include an input/output device 907.
  • The memory 903 may be used to store software programs and modules, such as the program instructions/modules corresponding to the methods and apparatuses in the embodiments of the present application.
  • The processor 901 performs various functional applications and data processing, i.e., implements the above method, by running the software programs and modules stored in the memory 903.
  • Memory 903 can include high speed random access memory, and can also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory.
  • memory 903 can further include memory remotely located relative to processor 901, which can be connected to the terminal over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the transmission device 905 described above is for receiving or transmitting data via a network, and can also be used for data transmission between the processor and the memory. Specific examples of the above network may include a wired network and a wireless network.
  • the transmission device 905 includes a Network Interface Controller (NIC) that can be connected to other network devices and routers via a network cable to communicate with the Internet or a local area network.
  • the transmission device 905 is a Radio Frequency (RF) module for communicating with the Internet wirelessly.
  • the memory 903 is used to store an application.
  • The processor 901 may invoke, through the transmission device 905, the application stored in the memory 903 to perform the following steps: performing image acquisition on the driving road surface of the vehicle to obtain image information of the driving road surface; identifying, from the image information, first lane information of the driving road surface, where the first lane information includes a first lane indicating the lane position of the vehicle on the driving road surface and the feature of the first lane; and determining the target lane of the vehicle in the map by matching the feature of the first lane against the feature of at least one second lane in the map of the driving road surface, where the at least one second lane and the feature of the at least one second lane are acquired from the map, and the at least one second lane includes the target lane.
  • The processor 901 is further configured to perform the following steps: searching the at least one second lane for a third lane whose lane line feature and lane width match the lane line feature and lane width of the first lane, where the third lane is the lane corresponding to the first lane in the map; and determining that the lane, among the at least one second lane, that has a relative positional relationship with the third lane is the target lane of the vehicle in the map, where the relative positional relationship is the positional relationship between the driving lane of the vehicle on the driving road surface and the first lane.
  • With the embodiments of the present application, the first lane information is identified by performing image processing on the image information of the driving road surface, and the feature of the first lane is then matched against the feature of at least one second lane in the map of the driving road surface to determine the target lane of the vehicle in the map. Lateral positioning of the vehicle, i.e., determining the lane it is actually in, can thus be achieved using only the existing equipment of current self-driving vehicles. This solves the technical problem of the high cost of accurate vehicle positioning, achieving the technical effect of positioning the vehicle accurately while reducing cost.
  • The structure shown in FIG. 9 is only illustrative; the terminal may be a smartphone (such as an Android or iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, or other terminal device.
  • FIG. 9 does not limit the structure of the above electronic device.
  • The terminal may also include more or fewer components than shown in FIG. 9 (such as a network interface or display device), or have a configuration different from that shown in FIG. 9.
  • The present application also provides a storage medium.
  • In this embodiment, the storage medium may be used to store program code for executing the method described above; the storage medium may be located on at least one of the plurality of network devices in the network shown in the above embodiment, and is arranged to store program code for performing the steps of the lane determination method.
  • The foregoing storage medium may include, but is not limited to, a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), and a removable hard disk.
  • If the integrated unit in the above embodiment is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in the above computer-readable storage medium.
  • Based on such an understanding, the technical solution of the embodiments of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in the storage medium and includes a number of instructions for causing one or more computer devices (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the various embodiments of the present application.
  • In the above embodiments of the present application, the disclosed client may be implemented in other manners.
  • The device embodiments described above are merely illustrative.
  • The division into units is only a division by logical function; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, unit or module, and may be electrical or otherwise.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.


Abstract

An embodiment of the present application discloses a lane determination method and a computing device. The method includes: acquiring image information of the driving road surface of a vehicle, collected from the vehicle; identifying, from the image information, first lane information of at least one first lane of the driving road surface, where the first lane information includes the positional relationship between the driving lane in which the vehicle is located on the driving road surface and the at least one first lane, and the feature of the first lane; and determining the target lane of the vehicle in a map by matching the feature of the at least one first lane against the feature of at least one second lane in the map of the driving road surface, where the target lane is the lane corresponding to the driving lane in the map, and the feature of the at least one second lane is acquired from the map.

Description

Lane determination method, apparatus, and storage medium
This application claims priority to Chinese Patent Application No. 201710073556.7, entitled "Method and apparatus for determining a target lane", filed with the Chinese Patent Office on February 7, 2017, which is incorporated herein by reference in its entirety.
Technical Field
The present application relates to positioning technology, and in particular to a lane determination method, apparatus, and storage medium.
Background
With advances in technology, electronic maps and vehicle navigation are now widely used in daily life, and driverless technology is also emerging. In these technologies, positioning the vehicle is a basic and indispensable capability. Current vehicle positioning methods mainly rely on satellites or on-board sensors to locate the vehicle's position.
Summary
According to an embodiment of the present application, a lane determination method is provided, including: acquiring image information of the driving road surface of a vehicle, collected from the vehicle; identifying, from the image information, first lane information of at least one first lane of the driving road surface, where the first lane information includes the positional relationship indicating how the driving lane in which the vehicle is located on the driving road surface relates to the at least one first lane, and the feature of the at least one first lane; and determining the target lane of the vehicle in a map by matching the feature of the at least one first lane against the feature of at least one second lane in the map of the driving road surface, where the target lane is the lane corresponding to the driving lane in the map, and the feature of the at least one second lane is acquired from the map.
According to an embodiment of the present application, a computing device is further provided, including a processor and a memory, the memory storing computer-readable instructions that cause the processor to: acquire image information of the driving road surface of a vehicle, collected from the vehicle; identify, from the image information, first lane information of at least one first lane of the driving road surface, where the first lane information includes the positional relationship indicating how the driving lane in which the vehicle is located on the driving road surface relates to the first lane, and the feature of the at least one first lane; and determine the target lane of the vehicle in a map by matching the feature of the at least one first lane against the feature of at least one second lane in the map of the driving road surface, where the target lane is the lane corresponding to the driving lane in the map, and the feature of the at least one second lane is acquired from the map.
An embodiment of the present application further provides a non-volatile computer-readable storage medium storing computer-readable instructions that cause at least one processor to perform the method described above.
Brief Description of the Drawings
The drawings described here are provided for a further understanding of the embodiments of the present application and constitute a part of them; the illustrative embodiments of the present application and their description are used to explain the application and do not unduly limit it. In the drawings:
FIG. 1a is a schematic diagram of an implementation environment according to an embodiment of the present application;
FIG. 1b is a hardware structure diagram of a computing device according to an embodiment of the present application;
FIG. 2a is a flowchart of a lane determination method according to an embodiment of the present application;
FIG. 2b is a schematic diagram of a lane determination method according to an embodiment of the present application;
FIG. 3 is a flowchart of a lane determination method according to an embodiment of the present application;
FIG. 4a is a flowchart of a lane determination method according to an embodiment of the present application;
FIG. 4b is a schematic diagram of a lane determination method according to an embodiment of the present application;
FIG. 4c is a flowchart of a lane determination method according to an embodiment of the present application;
FIG. 4d is a schematic diagram of a lane determination method according to an embodiment of the present application;
FIG. 5a is a flowchart of a lane determination method according to an embodiment of the present application;
FIG. 5b is a flowchart of a lane determination method according to an embodiment of the present application; FIG. 6a is a schematic diagram of image processing according to an embodiment of the present application; FIG. 6b is a schematic diagram of image processing according to an embodiment of the present application; FIG. 6c is a schematic diagram of lanes according to an embodiment of the present application; FIG. 7 is a schematic diagram of a lane determination apparatus according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a target lane determination apparatus according to an embodiment of the present application; and
FIG. 9 is a structural block diagram of a terminal according to an embodiment of the present application.
Detailed Description
The described embodiments are only some of the embodiments of the present application, not all of them. FIG. 1a is a schematic diagram of an implementation environment according to an embodiment of the present application. As shown in FIG. 1a, a server 102 may connect over a network with an on-board device (also called a terminal device) in a car 104. The network may include, but is not limited to, a wide area network, a metropolitan area network, a local area network, etc.
The terminal device 106 may include, but is not limited to, various on-board terminals, such as a PC, a mobile phone, a tablet computer, a driving recorder, an on-board navigator, etc.
The method of the embodiments of the present application may be executed by the server 102, by the terminal device 106 on the car 104, or jointly by the server 102 and the terminal device 106. When the terminal 106 executes the method of the embodiments of the present application, it may also be executed by a client installed on it.
FIG. 1b is a structure diagram of a computing device according to an embodiment of the present application. The computing device 100b may be the terminal device 106 in the implementation environment 100a shown in FIG. 1a, or the server 102.
Referring to FIG. 1b, the computing device 100b may include one or more of the following components: a processor 101, a memory 103, and a communication module 105.
The computing device 100b may use the communication module 105 to communicate with other devices over a network.
The processor 101 may include one or more processors, which may be single-core or multi-core, and may reside in the same physical device or be distributed across multiple physical devices. The processor 101 may execute instructions.
The memory 103 may include an operating system 107, a communication module 108, and a lane determination module 109. The lane determination module 109 may include computer-readable instructions that cause the processor 101 to execute the lane determination method of the embodiments of the present application.
FIG. 2a is a flowchart of a lane determination method according to an embodiment of the present application. As shown in FIG. 2a, the method may include the following steps.
S201: acquiring image information of the driving road surface of the vehicle, collected from the vehicle.
In some embodiments, an image acquisition device may be mounted on the vehicle so that it collects image information of the driving road surface of the vehicle. The image information may include pictures and video. The image acquisition device may include a camera, a thermal imager, etc.
For example, the image acquisition device may be mounted at the front or rear of the vehicle, e.g., at the middle of the upper edge of the front or rear windshield, with the camera as parallel to the road surface as possible and its optical axis pointing straight ahead of, or straight behind, the direction of travel. In this way the vanishing point of the road lies near the center of the video image, and the lane lines on both sides of the vehicle fall within the video image.
S202: identifying, from the image information, first lane information of at least one first lane of the driving road surface, where the first lane information includes the positional relationship between the driving lane in which the vehicle is located on the driving road surface and the at least one first lane, and the feature of the at least one first lane.
In some embodiments, the driving lane of the vehicle may be determined in the image information, and the positional relationship may then be determined according to the determined driving lane and the feature of the first lane. Here, the driving lane is the lane in which the vehicle is located on the road surface, and a first lane is a lane identified from the image information. The positional relationship is the relationship between the driving lane and the first lane. When the first lanes do not include the driving lane, the positional relationship may be, e.g., that the driving lane is to the left or right of the first lane, or is the second lane to the left outside the first lanes. When the first lanes include the driving lane, the positional relationship may be, e.g., that the driving lane is the second of the first lanes from the left. In some embodiments, the positional relationship between the driving lane and the first lanes may also be expressed as the positional relationship between the driving lane and the lane lines of the first lanes; for example, the driving lane may be the lane bounded by the second and third lane lines from the left among the six lane lines of the first lanes.
In some embodiments, when determining the driving lane in the image information, the lane containing the center line of the image may be determined as the driving lane. For example, as shown in FIG. 2b, the center line 21 of the image 20 is taken along the lane extension direction, and the lane L3 containing the center line is determined as the driving lane.
In some embodiments, the lane with the largest lane width in the image information may be determined as the driving lane. For example, as shown in FIG. 2b, the driving lane may be determined according to the widths W1, W2, W3, W4, W5 of the lanes L1, L2, L3, L4, L5 on the side of the image near the vehicle; the width W3 of lane L3 is the largest, so lane L3 may be determined to be the driving lane.
In some embodiments, the driving lane may be determined according to the shapes of the lanes in the image information. For example, as shown in FIG. 2b, two parallel auxiliary lines 22, 23 may be drawn in the image 20 perpendicular to the lane extension direction, yielding a trapezoid for each lane; among these trapezoids, the lane L3 corresponding to the trapezoid T3 whose two base angles on one of the auxiliary lines are both acute (angles A1, A2 in FIG. 2b) is taken as the driving lane. As another example, two parallel lines 22, 23 perpendicular to the lane extension direction may be drawn, and the lane bounded by the two adjacent lane lines forming the trapezoid of largest area with these two parallel lines may be determined to be the driving lane; in FIG. 2b the trapezoid T3 has the largest area, so lane L3 may be determined to be the driving lane.
In some embodiments, the driving lane may be determined according to the angles of the lane lines in the image information. For example, multiple lane lines may be identified in the image, and the lane bounded by two adjacent lane lines with opposite inclination directions may be determined to be the driving lane. As another example, the lane bounded by the two adjacent lane lines enclosing the largest angle may be determined to be the driving lane.
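The width-based heuristic above can be sketched in a few lines. This is an illustrative sketch under the stated assumption that per-lane pixel widths near the vehicle have already been measured; the function name and width values are not from the document.

```python
# Sketch of picking the driving lane as the widest lane in the image: the
# lane under the camera appears widest on the image side nearest the
# vehicle. Widths below are illustrative pixel measurements for W1..W5.

def driving_lane_by_width(widths_px):
    """widths_px: per-lane widths W1..Wn in pixels, left to right.
    Returns the 0-based index of the driving lane."""
    return max(range(len(widths_px)), key=lambda i: widths_px[i])

widths = [180, 260, 420, 250, 170]           # lanes L1..L5 (illustrative)
lane_index = driving_lane_by_width(widths)   # index 2, i.e. lane L3
```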
S203: determining the target lane of the vehicle in a map by matching the feature of the at least one first lane against the feature of at least one second lane in the map of the driving road surface.
The target lane is the lane corresponding to the driving lane in the map, and the at least one second lane and the feature of the at least one second lane are acquired from the map.
According to the method of the embodiments, by matching the lane information extracted from images collected on the vehicle against the lane information of the same geographic location obtained from a map, and thereby determining the target lane in the map corresponding to the vehicle's driving lane, vehicle positioning can be made accurate to the lane. Lane information is very useful in technologies such as vehicle navigation and automatic driving, helping to improve navigation accuracy, driving safety, etc.
In the embodiments, various features of the first lane may be extracted from the image for lane identification, such as lane line features, lane width, markings painted on the road surface, traffic signs, speed limit signs, etc. Below, the solutions of the embodiments are described using lane line features as the lane feature; implementations using other lane features are similar and are not repeated here. FIG. 3 is a flowchart of a lane determination method according to an embodiment of the present application. As shown in FIG. 3, the method may include the following steps.
S301: identifying, from image information of the driving road surface collected on the vehicle, a first lane line feature of a first lane of the driving road surface, and the positional relationship between the driving lane and the first lane lines. The first lane line feature may include features of one or more first lane lines.
S302: acquiring, from a map, a second lane line feature of the road at the geographic location of the vehicle. The second lane line feature may include features of one or more second lane lines.
S303: determining, using the second lane line feature, the first lane line feature, and the above positional relationship, the target lane in the map corresponding to the driving lane.
In the embodiments, matching the features of the lane lines identified from the image against the features of the lane lines of the corresponding road surface in the map, and thereby determining the target lane in the map corresponding to the driving lane, can reduce the amount of computation needed to determine the lane.
In the embodiments, there are many ways to use lane line features to determine the target lane in S303; several examples follow. FIG. 4a is a flowchart of a lane determination method according to an embodiment of the present application. As shown in FIG. 4a, the method may include the following steps.
S401: determining, by comparing the first lane line feature with the second lane line feature, third lane lines corresponding to the first lane lines among the second lane lines.
S402: determining, according to the third lane lines and the positional relationship between the driving lane and the first lane lines, the target lane in the map corresponding to the driving lane.
In some embodiments, when the first lane line feature includes the line type of one first lane line and the second lane line feature includes the line types of one or more second lane lines, the lane line among the one or more second lane lines that has the line type of the first lane line may be taken as the third lane line; the target lane in the map corresponding to the driving lane is then determined according to the third lane line and the positional relationship.
In the embodiments, the line type of a lane line may include solid, dashed, double solid, straight, curved, etc.
For example, suppose only one lane line is identified from the image information, its line type is dashed, and the positional relationship identified from the image information is that the driving lane is to the left of this lane line. Three lane lines of the road are acquired from the map, with line types solid, dashed, and solid. Among these three lane lines, the dashed one may be taken as the lane line in the map corresponding to the first lane line identified in the image, i.e., the third lane line. According to this third lane line and the positional relationship, the lane to the left of the third lane line in the map is determined to be the target lane.
In some embodiments, when the first lane line feature includes the line types and arrangement of multiple first lane lines, and the second lane line feature includes the line types and arrangement of multiple second lane lines, the lane lines among the multiple second lane lines that have the line types and arrangement of the multiple first lane lines may be taken as the third lane lines. The target lane in the map corresponding to the driving lane is determined according to the determined third lane lines and the positional relationship.
For example, suppose multiple first lane lines are identified from the image information collected by the camera, with line types and arrangement solid, dashed, dashed, dashed, and the positional relationship identified from the image information is that the vehicle is in the middle of the four first lane lines, i.e., the lane lines of the first two line types are the two lines to the vehicle's left and those of the last two are the two lines to its right. Meanwhile, the map provides that the road has five lanes, i.e., six second lane lines, whose second lane line feature is solid, dashed, dashed, dashed, dashed, solid. Among these six second lane lines, the lane lines with the first lane line feature of line types and arrangement solid, dashed, dashed, dashed, i.e., the first to fourth lane lines, may be taken as the third lane lines. From the determined third lane lines and the positional relationship it can be determined that the vehicle is in the middle of the third lane lines, i.e., to the right of the second lane line or the left of the third lane line, so the target lane is the second lane from the left in the map.
In some examples, symbols may be used to represent lane line types, and symbol strings to represent the arrangement of multiple lane lines. For example, according to a preset correspondence between line types and symbols, the symbols corresponding to the line types of the first lane lines may be organized in the arrangement order of the first lane lines to generate a first symbol string representing the first lane line feature; according to the same correspondence, the symbols corresponding to the line types of the second lane lines are organized in the arrangement order of the second lane lines to generate a second symbol string representing the second lane line feature. The first symbol string is compared with the second symbol string, and the lane lines corresponding to the symbols in the second symbol string that match the first symbol string are taken as the third lane lines; the target lane in the map corresponding to the driving lane is then determined according to the third lane lines and the positional relationship. For example, as shown in FIG. 4b, the line types and arrangement of the four first lane lines J1, J2, J3, J4 identified from the image information are {J1 solid, J2 dashed, J3 dashed, J4 dashed}. Representing a dashed line by 0 and a solid line by 1, the first symbol string representing the first lane line feature is "1000". Likewise, six second lane lines K1, K2, K3, K4, K5, K6 are acquired from the map with line types and arrangement {K1 solid, K2 dashed, K3 dashed, K4 dashed, K5 dashed, K6 solid}, giving the second symbol string "100001". Comparing "1000" with "100001" shows that the first four symbols of "100001" match "1000", so the four lane lines K1, K2, K3, K4 corresponding to those symbols are the third lane lines corresponding in the map to J1, J2, J3, J4. Since the positional relationship identified from the image is that the driving lane lies between the second and third lane lines from the left, the lane between the second and third of the third lane lines from the left, i.e., the lane between K2 and K3, is the target lane.
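The symbol-string matching in the FIG. 4b example can be sketched as below (dashed → '0', solid → '1'); the function names are illustrative, not from the document.

```python
# Sketch of the symbol-string matching above: encode lane line types as a
# string (solid -> '1', dashed -> '0'), find where the detected string
# occurs in the map string, and read off the driving lane from the matched
# lane lines. Names are illustrative.

SYMBOL = {'solid': '1', 'dashed': '0'}

def encode(line_types):
    return ''.join(SYMBOL[t] for t in line_types)

def match_lane_lines(detected_types, map_types):
    """Return the index in the map's lane lines where the detected lane
    lines align, or -1 if there is no match."""
    return encode(map_types).find(encode(detected_types))

detected = ['solid', 'dashed', 'dashed', 'dashed']          # J1..J4 -> "1000"
map_lines = ['solid', 'dashed', 'dashed', 'dashed',
             'dashed', 'solid']                             # K1..K6 -> "100001"
k = match_lane_lines(detected, map_lines)   # 0: J1..J4 align with K1..K4
# The driving lane lies between the 2nd and 3rd detected lines (J2, J3),
# i.e. between the map lines at offsets k+1 and k+2: between K2 and K3.
target_lane_index = k + 2   # 1-based: the 2nd lane from the left
```

Note that `str.find` returns only the first match; when a pattern could occur at several positions, additional features (e.g. lane widths) would be needed to disambiguate, as the embodiments note.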
FIG. 4c is a flowchart of a lane determination method according to an embodiment of the present application. As shown in FIG. 4c, the method may include the following steps.
S411: determining a lane description of the driving lane according to the first lane line feature and the positional relationship between the driving lane and the first lane lines.
S412: determining a lane description of each second lane of the at least one second lane according to the features of the second lane lines.
Here, the lane description of a lane may include the features of one or more lane lines and the positional relationship between those lane lines and the lane.
S413: comparing the lane description of the driving lane with the lane description of each second lane, and determining the second lane that conforms to the lane description of the driving lane as the target lane.
In some embodiments, when the lane description of the driving lane includes the line type of one first lane line and the positional relationship between that first lane line and the driving lane, the lane description of each second lane is determined. The lane description of each second lane may include the line type of one second lane line and the positional relationship between that second lane line and the second lane. Among the at least one second lane, the lane whose description's line type and positional relationship are consistent with those in the driving lane's description may be determined as the target lane. For example, when only one first lane line is identified from the image information, its line type is dashed, and the positional relationship identified from the image information is that the driving lane is to the left of this first lane line, the lane description of the driving lane may be determined, e.g., as "dashed-left". Three second lane lines are acquired from the map, with line types and arrangement solid, dashed, solid; of the two second lanes A and B they bound, lane A's description is "solid-left, dashed-right" and lane B's is "dashed-left, solid-right". Comparing the driving lane's description with the descriptions of lanes A and B, the description of lane B among the second lanes is consistent with the driving lane's feature, so lane B is determined to be the target lane in the map corresponding to the driving lane.
In some embodiments, when the lane description of the driving lane includes the line types and first arrangement of multiple first lane lines and the positional relationship between those first lane lines and the driving lane, the lane description of each second lane is determined. The lane description of each second lane includes the line types and second arrangement of multiple second lane lines and the positional relationship between those second lane lines and the second lane. Among the at least one second lane, the lane whose description's line types, arrangement, and positional relationship are consistent with the line types, first arrangement, and positional relationship in the driving lane's description is determined as the target lane. For example, when four first lane lines are identified from the image information with line types and arrangement {solid, dashed, dashed, dashed}, and the driving lane lies between the middle two of the four first lane lines, the driving lane's description may be determined, e.g., as "solid, dashed, driving lane, dashed, dashed". Six second lane lines, and the five second lanes they bound, are acquired from the map, the six lines' types and arrangement being {solid, dashed, dashed, dashed, dashed, solid}. The descriptions of the five second lanes can be determined as: lane A {solid, lane A, dashed, dashed, dashed, dashed, solid}, lane B {solid, dashed, lane B, dashed, dashed, dashed, solid}, lane C {solid, dashed, dashed, lane C, dashed, dashed, solid}, lane D {solid, dashed, dashed, dashed, lane D, dashed, solid}, and lane E {solid, dashed, dashed, dashed, dashed, lane E, solid}. Comparison shows that lane B's description conforms to the driving lane's description, so lane B is determined to be the target lane in the map corresponding to the driving lane.
In some embodiments, according to a preset correspondence between line types/positional relationships and symbols, the symbols corresponding to the first lane lines in the driving lane's description may be organized in the first arrangement to generate a first symbol string representing the driving lane's description; according to the same correspondence, the symbols corresponding to the multiple second lane lines in each second lane's description are organized in the second arrangement to generate a second symbol string representing that second lane's description. The first symbol string is compared with the second symbol string of each second lane, and the second lane whose second symbol string conforms to the first symbol string is determined to be the target lane. For example, as shown in FIG. 4d, the four first lane lines J1, J2, J3, J4 identified from the image information have line types and arrangement {J1 solid, J2 dashed, J3 dashed, J4 dashed}, and the positional relationship between the driving lane L0 and the first lane lines is that the driving lane lies between lane lines J2 and J3. Representing a dashed line by 0, a solid line by 1, and the lane by Q, the first symbol string representing the driving lane's description is "10Q00". Likewise, six second lane lines K1, K2, K3, K4, K5, K6 are acquired from the map with line types and arrangement {K1 solid, K2 dashed, K3 dashed, K4 dashed, K5 dashed, K6 solid}, and the second symbol strings of the five second lanes' descriptions are L1: "1Q00001", L2: "10Q0001", L3: "100Q001", L4: "1000Q01", L5: "10000Q1". Comparing the first symbol string with each second symbol string shows that lane L2's string "10Q0001" conforms to driving lane L0's first symbol string "10Q00", so L2 is determined to be the target lane in the map corresponding to driving lane L0.
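The lane-description matching with the "Q" placeholder can be sketched as below. This sketch assumes, as in the FIG. 4d example, that the detected lane lines align with the left edge of the road; a general version would slide the detected pattern over the map lines as in the FIG. 4b matching. The function name is illustrative.

```python
# Sketch of the "Q"-string matching above: the driving lane's description
# ("10Q00") conforms to a candidate second lane's description when their
# 'Q' markers align and the symbols around Q agree -- i.e. the detected
# lines around the vehicle match the map lines around that candidate lane.

def conforms(first, second):
    """first: driving-lane string, e.g. '10Q00'; second: a candidate
    second-lane string, e.g. '10Q0001'."""
    if first.index('Q') != second.index('Q'):
        return False                      # different count of lines left of the lane
    return second[:len(first)] == first   # symbols around Q agree

candidates = {'L1': '1Q00001', 'L2': '10Q0001', 'L3': '100Q001',
              'L4': '1000Q01', 'L5': '10000Q1'}
target = [name for name, s in candidates.items() if conforms('10Q00', s)]
# Only L2 ("10Q0001") conforms to the driving lane's "10Q00".
```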
In some embodiments, the second lanes or second lane lines acquired from the map may be lanes or lane lines consistent with the vehicle's driving direction, acquired from the map according to the driving direction of the vehicle.
图5a是根据本申请实施例的一种可选的车道的确定方法的流程图,如图5a所示,该方法可以包括以下步骤:
步骤S501,对车辆的行驶路面进行图像采集得到行驶路面的图像信息;
步骤S502,从图像信息中识别出行驶路面的第一车道信息,第一车道信息包括用于指示车辆在行驶路面的车道位置的第一车道和第一车道的特征;
步骤S503,通过将第一车道的特征与行驶路面所在地图中的至少一个第二车道的特征进行特征匹配确定车辆在地图中的目标车道,至少一个第二车道和至少一个第二车道的特征为从地图中获取的,至少一个第二车道包括目标车道。
通过上述步骤S501至步骤S503,通过对行驶路面的图像信息进行图像处理识别出第一车道信息,然后将第一车道的特征与行驶路面所在地图中的至少一个第二车道的特征进行特征匹配确定车辆在地图中的目标车道,仅通过目前的自动行驶车辆的已有设备即可实现车辆的横向定位,确定其实际所在的车道,可以解决了车辆进行准确定 位时的投入成本较高的技术问题,进而达到对车辆进行准确定位并降低成本的技术效果。
上述的行驶路面为车辆当前行驶的路面;上述的图像信息为具有行驶路面的各个特征的图像,如彩色或者黑白图片、彩色或者黑白视频、热成像图片等;上述的特征包括车道线特征、车道宽度、车道类型、道路标识、交通标志、限速标识等,车道线特征至少包括虚线、实线、直线、曲线,等。车道类型可以包括高速公路、国道等等。上述的车辆可以为自动行驶车、无人驾驶车辆、各类型的机动车等。
上述的第一车道信息所指示的第一车道可以为车辆实际行驶的车道,也可以是和车辆实际行驶的车道具有相对位置关系的任意车道。
本申请实施例的上述方法主要用于车辆的横向定位,包括但不局限于用于自动行驶车辆的横向定位。
需要说明的是,由于车辆可能存在压线(即车道线)的情况下,为了便于描述,在本申请实施例中,在车辆压线的情况下,只要车辆的宽度在某一车道中达到了50%,即可认为车辆处于该车道中。
无人驾驶汽车自有汽车工业以来,便一直是各国汽车产业菁英所极力挑战的热门项目之一。由于无人驾驶汽车在危险工作环境,以至于军事上应用的具有无穷的应用前景,亦吸引各国政府的投入,无奈由于计算机运算能力与控制***设计的问题,目前自动驾驶的车辆,均局限于封闭的场所或是特定目的使用,随着技术的发展,近年来,众多汽车公司开始不约而同地展示自动驾驶的科技,甚至有一些汽车大厂更预言在2020年就可以将此技术普及。
对于无人驾驶汽车,定位的准确与否直接影响行驶的安全性,相关技术方案中均为无人驾驶汽车定位的完全定位解决方案,既包括横向定位,也包括纵向定位,并且都隐含了GPS/BDS的应用。本申请 实施例的技术方案专注于解决无人驾驶汽车的横向定位,弱化了GPS/BDS的应用相关的描述,在GPS/BDS定位精度不高时,是对GPS/BDS定位的一种辅助。对于无人驾驶汽车而言,车辆的横向定位相较于纵向定位来说更为基础和重要,尤其是在车辆行驶过程中更是如此。这是因为无人驾驶汽车是按照预先在高精度地图上规划好的路径行驶的,路径至少要精确到车道级。只要车辆没有遇到突发情况或路口,就可一直沿着该路径行驶下去,因此在纵向行驶方向上,车辆相较于其实际位置超前一些或是滞后一些,都对无人驾驶汽车影响不大,但是在横向上就需要有精确的位置信息提供给车辆,车辆只有知道当前所在车道的具***置,才能决策和规划下一步的行驶动作。利用本申请实施例的技术方案,可以实现车辆在横向位置上的准确定位,下面结合图2详述本申请的实施例:
在步骤S501提供的技术方案中,在对车辆的行驶路面进行图像采集得到行驶路面的图像信息时,可以利用车载相机采集车辆行驶路面的图片或者视频;还可以利用车载热成像仪采集车辆行驶路面的热成像图。
例如,可以将检测车道线的摄像头安装于车辆的前方,一般安装于前挡风玻璃上沿中间的位置,使摄像头尽量平行于路面并使光轴指向行驶的正前方,这样道路的灭点在视频图像的中心附近,车辆两侧的车道线都可落在视频图像中。优选地,选择摄像头的等效焦距不应过大,以保证足够的视角将车辆两侧的车道线拍摄进去。
需要说明的是,一旦图像采集装置(如摄像头)在车辆上安装固定之后,那么采集装置的采集区域也就确定了,也即车辆在采集区域中的车道位置是相对固定的,例如对于上述安装在车辆正前方中心的摄像头,若其采集宽度为5条车道,那么采集宽度的中间位置所在的车道即车辆所在的车道。
在步骤S502提供的技术方案中,在从图像信息中识别出行驶路 面的第一车道信息时,可对图像信息进行正射投影处理;从经过正射投影处理后的图像信息进行特征提取;将经过特征提取后的图像信息进行透视反投影处理。
在上述的透视投影图像中,车道线与线型干扰物不易分辨出来,尤其是虚线类型的车道线,在图像中往往以短线段形式出现,这样很难和一些线状的干扰物完全区分开来。正射投影处理过后,虚线类型的车道线被映射成长线段并与其它车道线相平行,如图5所示,而线状干扰物相较于其它车道线或是不平行或是间距有异常,这样就很容易的将其剔除。
通过上述的图像处理,可以识别出图像中的每条车道和每条车道的车道线特征(如虚实线、线宽等)。
在从图像信息中识别出行驶路面的第一车道信息时,可以识别出图像中的一个或者多个车道(即第一车道),通过这一个或者多个车道来确定目标车道,优选地,为了提高识别准确度,可以识别出图像中的全部第一车道,这样,通过比较多个车道的特征,可以更为准确的确定车辆实际所在的车道。
在步骤S503提供的技术方案中,在通过将第一车道的特征与行驶路面所在地图中的至少一个第二车道的特征进行特征匹配确定车辆在地图中的目标车道之前,可以通过如下方式获取第二车道的特征:获取车辆的卫星定位信息,也即通过车载GPS/BDS/GLONASS等卫星定位***获取车辆的卫星定位信息,受限于定位准确度、环境等因素的影响,卫星定位信息仅可较为粗略的表示车辆当前的位置,其精确度较低,因此需要进一步进行本申请的精确定位;根据卫星定位信息获取行驶路面所在路段的地图,地图中携带有至少一个第二车道和至少一个第二车道的特征。
上述使用到的地图为高精度地图,获取行驶路面所在路段的地图可以在车载地图中定位到具体路段的地图;也可以通过在线获取的方 式从互联网的高精度地图中获取行驶路面所在路段的地图。
上述的车道的特征主要包括车道线特征和车道宽度,在通过将第一车道的特征与行驶路面所在地图中的至少一个第二车道的特征进行特征匹配确定车辆在地图中的目标车道时,可以通过如下方式实现:从至少一个第二车道中查找车道线特征和车道宽度与第一车道的车道线特征和车道宽度匹配的第三车道,第三车道为第一车道在地图中对应的车道,例如,从第二车道中查找到车道线为实线,车道宽度为3米的第三车道;确定至少一个第二车道中与第三车道具有相对位置关系的车道为车辆在地图中的目标车道,相对位置关系是车辆在行驶路面上的行驶车道与第一车道之间的位置关系,例如,识别出的第一车道为车辆实际行驶车道左侧的车道,那么地图中第三车道左侧的车道即为目标车道。
在一些实施例中,在第一车道为一个的情况下,若只考虑车道虚实线的特征可能难以匹配到唯一结果,因此,还可以比对虚实线的宽度等特征,以便于根据一条车道特征匹配到唯一结果。
在一些实施例中,在第一车道和第二车道为多个,且多个第一车道包括车辆在行驶路面上的行驶车道的情况下,在从至少一个第二车道中查找车道线特征和车道宽度与第一车道的车道线特征和车道宽度匹配的第三车道时,在多个第二车道中查找到数量和特征与多个第一车道匹配的多个第四车道。
也即在图片中识别出的是多个第一车道,一般情况下这多个第一车道为相互邻接的车道,例如,为连续的三个车道,而路宽实际为五个车道,那么可以将这三个车道与五个车道进行匹配,匹配的过程中可以用这三个车道的最左侧的车道线对准五个车道的最左侧的车道线进行匹配,每次匹配完成后向右移动一条车道继续进行匹配,直至找到五个车道中与这三个车道特征完全一致的连续的三个车道;同理,也可以从最右侧的车道线开始匹配。
在从至少一个第二车道中查找车道线特征和车道宽度与第一车道的车道线特征和车道宽度匹配的第三车道时,可判断至少一个第二车道中的任一车道的车道线特征与第一车道的车道线特征是否相同,即判断是否均为实线或者均为虚线;由于车道宽度可能存在一定的误差,因此在比较车道宽度时,可判断任一车道的车道宽度与第一车道的车道宽度之间的差值是否小于预设值;在判断出任一车道的车道线特征与第一车道的车道线特征相同,且任一车道的车道宽度与第一车道的车道宽度之间的差值小于预设值的情况下,确定任一车道为车道线特征和车道宽度与第一车道的车道线特征和车道宽度匹配的第三车道。
在确定至少一个第二车道中与第三车道具有相对位置关系的车道为车辆在地图中的目标车道时,根据车辆在行驶路面上的行驶车道在多个第一车道中的位置,在多个第四车道中确定目标车道。例如,识别了三个第一车道,行驶车道为这三个相邻的第一车道中处于中间位置的车道,确定了上述的五个车道中与这三个第一车道匹配的连续的三个车道(即第四车道)之后,即可以确定三个第四车道中处于中间位置的车道为目标车道。
需要说明的是,上述目标车道具有用于指示确定准确度的第一置信度,在通过将第一车道的特征与行驶路面所在地图中的至少一个第二车道的特征进行特征匹配确定车辆在地图中的目标车道之后,获取定位传感器对车辆进行定位得到的车辆在地图中的第五车道,第五车道具有用于指示定位准确度的第二置信度;从目标车道和第五车道中选择置信度满足预设条件的车道为车辆在地图中的实际车道,预设条件为用于确定车辆在地图中的实际车道的筛选条件。
在车辆的智能行驶***中,可以参考多个传感器的定位结果,将各个传感器的定位结果依据置信度融合在一起,最终确定车辆的具***置,例如,以置信度较高的车道为最终的结果;在多个置信度比较 接近的情况下,若其中多个置信度对应于同一个车道,那么可以将该车道作为最终识别结果,按照该结果来控制车辆的行驶,可以提高车辆行驶的安全性。
在一些实施例中,在本申请实施例的技术方案中,除了可以通过上述的方法确定车辆的行驶车道外,还可以更为具体的对车辆的横向位置进行定位,在从图像信息中识别出行驶路面的第一车道信息之后,可以根据图像信息在车道方向上的中心线的位置(也即车辆在图像中的相对位置)和第一车道的车道线的位置确定车辆与第一车道的车道线之间的距离。
例如,若摄像头安装于车辆的前方的中心位置,那么其采集宽度中的中间位置即为车辆所在的位置,也即对于任意采集到的图像,该图像在横向上的中心位置即车辆的位置,在识别出车道线之后,即可根据该中心位置、车道线、车宽计算出车辆与车道线的间距。以便于控制车辆在横向上的移动。若摄像头不是安装在车辆的中心位置,由于摄像头与车辆中心位置间的距离是确定的,也可以采用上述的方法实现。
本申请实施例的技术方案可应用于无人驾驶汽车的定位需求,可以和其它车辆定位方案一起融合,整体提高车辆定位的精度和稳定性,也可用于ADAS(高级辅助驾驶***)的车道保持及变道提醒中,保证车辆的安全驾驶。
下面结合具体的实施方式详述本申请的实施例:
对于无人驾驶而言,除了高精度地图以外,业内普遍认为定位、感知、决策与控制是无人驾驶汽车构成的四大模块。传统的车辆自定位通过一个普通的GPS即可完成,一般它的精度在1~10m左右,这样的精度就连主路还是辅路、桥上还是桥下等都是无法分辨清楚的, 更不用说是哪条车道了,而在自动驾驶中,为了达到较高的定位精度,常见的一种方案是通过高精度的差分“GPS+惯导IMU+地面基站”联合完成,但动辄几十万的硬件成本压力,在现有情况下很难完成自动驾驶车辆的量产需求。本申请实施例所采用的技术方案相较于前一种技术方案,成本得到了极大的控制,它由“高精度地图+普通GPS+摄像头”联合完成。
普通GPS负责粗定位,它的定位精度一般在1~10m左右,这样的定位精度不足以满足无人驾驶车辆的需求,但是足以通过它来定位车辆周围高精度地图中一段道路数据,作为车辆定位的参考。接下来利用摄像头检测出的车道线进行的精确定位,即是在这段道路数据中找寻最佳匹配的位置。下面结合图5b所示的步骤详述该实施方式,具体的定位步骤如下:
步骤S511,通过GPS获取车辆当前位置的坐标。
步骤S512,由当前位置的坐标确定车辆周围一段高精度地图中的道路信息,包括共几条车道线及每条车道线的线型和相距宽度。
步骤S513,根据摄像头检测出的当前车辆左右两侧最近的各二条车道线(共四条车道线,如果有的话),判断各条车道线的线型。
通常情况下,摄像头对所述车辆的行驶路面进行实时采集,得到所述行驶路面的图像信息,终端上运行的客户端对采集的所述图像信息进行识别,并判断图像中各条车道线的车道信息,比如车道线特征和车道宽度等。
步骤S514,将检测出的车道线线型依次与之前获得的高精度地图中的车道线线型进行比对,由此确定车辆当前所在车道的位置。
步骤S515,当车辆变道时,车辆距离左右两侧车道线的距离会发生突变,据此更新车辆所载车道的位置。
步骤S516,对于当前定位的车辆横向位置,依据与先前对比的 位置变化及检出车道线的长度赋以置信度。
在一些实施例中,所述车辆上配置的终端设备上运行的客户端将摄像头采集的所述车辆的行驶路面的图像信息的当前帧与上一帧进行对比,如果当前帧中车辆的位置与上一帧中车辆的位置发生改变,则将当前车辆所处的车道的置信度降低;或者,对所述客户端在所述车辆的行驶路面的图像信息识别出的车道线类型赋予不同的置信度,对于识别出的车道线类型为实线的车道线赋予高的置信度,对于识别出的车道线类型为虚线的车道线赋予低的置信度。步骤S517,当置信度值大于某一阈值时,执行车道保持策略,从而暂停车道线比对。
而当置信度小于等于某一阈值时,重复上述步骤。
在上述的步骤S513中,可以按照如图6a所示的方式进行图像处理:视频图像截取、图像彩色转灰度、图像平滑去噪、图像Gamma校正、图像二值化处理、形态学修补、骨架提取、Hough滤波、正射投影、车道线矫正、Hough提取车道线、透视反投影、车道线融合、Kalman滤波、缺失车道线估计、车道线虚实判断以及车辆位置估计。
在视频图像(如图6b中子图a所示)中,不仅包含道路部分的影像,还包含有道路两侧景物及天空影像,因此车道线检测的第一步即是截取视频图像(如图6b中子图b所示),只保留感兴趣部分区域。然后将彩色图像变换成灰度图像,接着采用双边滤波法对图像进行平滑去噪。为提高对不同光线图像的适应性,在对图像进行二值化之前,先对图像进行Gamma校正。图像二值化(如图6b中子图c所示)之后,先利用形态学操作修补空洞,平滑边界,再通过骨架提取算法提取车道线中心线(如图6b中子图d所示),在此基础上利用Hough变换结果进行局部滤波,去除干扰和毛刺。这时得到的是车道线的透视投影图像(如图6b中子图e所示),依据相机内外参数将其变换为正射投影图像(如图6b中子图f所示),再经形态学平滑和骨架提取,便得到车道线的正视图。在正视图中,先做车道线矫正,这主要是为 处理弯道,接着Hough变换提取车道线(如图6b中子图g所示),再依据车道线之间的距离约束,去掉错误提取的车道线(如图6b中子图h所示)。最后,将剩余的车道线反投影回透视投影图像中,并与先前的透视投影图像中的车道线进行融合,得到最终的检测车道线。
为了输出连续稳定的车道线,可对每一条车道线进行Kalman滤波处理,并对由于遮挡等原因而短暂缺失的车道线进行位置估计。最后,基于连续帧图像中的车道线的连续性及车道线的长度,判断每条车道线的虚实。
在步骤S514提高的实施例中,道路的车道线布局形如图6c所示,将车道线用0和1进行编码,即车道线为虚线时标记为0,车道线为实线时标记为1,那么一条拥有5条车道(即编码为1至5的车道)的道路可以用100001来表示,道路的这些信息是由高精度地图提供的。类似地,由摄像头检测出的车道线同样对其进行编码,例如,检出四条车道线的编码为1000,其中前两位10代表车辆左侧的两条车道线,后两位00代表车辆右侧的两条车道线,那么车辆当前的纵向位置即在第2条车道内。类似的,当检出的四条车道线编码为0000时,车辆当前的纵向位置即在第3条车道内。
车辆的定位是车辆实现自动驾驶的关键技术之一,车辆需要通过定位它的相对位置来精确感知它的周围环境。本申请实施例涉及一种基于高精度地图的横向定位方法,所谓横向定位,是指垂直于车辆行进方向上的定位,例如车辆是在某一车道上行驶,以及相较于车道线的具***置。它作为对GPS/BDS定位的一种辅助或替代,在卫星信号不好或无法获得的情况下,可以有效提高车辆定位的精度。已知当前道路车道线的高精度数据,包括车道线的线型(虚线或实线),车道线与车道线之间的相对位置及宽度等信息后,通过安装在车辆前端的摄像头实时检测并识别到的多条车道线信息与已知的信息相匹配,从而定位出车辆在道路上的哪条车道上行驶,以及相较于车道线的具 ***置。
在本申请实施例中,通过对行驶路面的图像信息进行图像处理识别出第一车道信息,然后将第一车道的特征与行驶路面所在地图中的至少一个第二车道的特征进行特征匹配以确定车辆在地图中的目标车道,仅通过目前的自动行驶车辆的已有设备即可实现车辆的横向定位,确定其实际所在的车道,可以解决了车辆进行准确定位时的投入成本较高的技术问题,进而达到对车辆进行准确定位并降低成本的技术效果。
需要说明的是,对于前述的各方法实施例,为了简单描述,故将其都表述为一系列的动作组合,但是本领域技术人员应该知悉,本申请实施例并不受所描述的动作顺序的限制,因为依据本申请实施例,某些步骤可以采用其他顺序或者同时进行。其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于优选实施例,所涉及的动作和模块并不一定是本申请实施例所必须的。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到根据上述实施例的方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,或者网络设备等)执行本申请实施例各个实施例所述的方法。
本申请还提供了一种用于实施上述车道的确定方法的车道的确定装置。图7是根据本申请实施例的一种车道的确定装置的示意图,如图7所示,该装置可以包括:一个或一个以上存储器;一个或一个以上处理器;其中,所述一个或一个以上存储器存储有一个或者一个以上指令模块,经配置由所述一个或者一个以上处理器执行;其中, 所述一个或者一个以上指令模块包括:采集单元72、识别单元74以及第一确定单元76。
采集单元72,用于对车辆的行驶路面进行图像采集得到行驶路面的图像信息;
识别单元74,用于从图像信息中识别出行驶路面的第一车道信息,其中,第一车道信息包括用于指示车辆在行驶路面的车道位置的第一车道和第一车道的特征;
第一确定单元76,用于通过将第一车道的特征与行驶路面所在地图中的至少一个第二车道的特征进行特征匹配确定车辆在地图中的目标车道,其中,至少一个第二车道和至少一个第二车道的特征为从地图中获取的,至少一个第二车道包括目标车道。
需要说明的是,该实施例中的采集单元72可以用于执行本申请实施例1中的步骤S511,该实施例中的识别单元74可以用于执行本申请实施例1中的步骤S512,该实施例中的第一确定单元76可以用于执行本申请实施例1中的步骤S513。
此处需要说明的是,上述模块与对应的步骤所实现的示例和应用场景相同,但不限于上述实施例1所公开的内容。需要说明的是,上述模块作为装置的一部分可以运行在如图1所示的硬件环境中,可以通过软件实现,也可以通过硬件实现。
通过上述模块,通过对行驶路面的图像信息进行图像处理识别出第一车道信息,然后将第一车道的特征与行驶路面所在地图中的至少一个第二车道的特征进行特征匹配确定车辆在地图中的目标车道,仅通过目前的自动行驶车辆的已有设备即可实现车辆的横向定位,确定其实际所在的车道,可以解决了车辆进行准确定位时的投入成本较高的技术问题,进而达到对车辆进行准确定位并降低成本的技术效果。
上述的行驶路面为车辆当前行驶的路面;上述的图像信息为具有 行驶路面的各个特征的图像,如彩色或者黑白图片、彩色或者黑白视频、热成像图片等;上述的特征包括车道线特征、车道宽度等,车道线特征至少包括虚线、实线;上述的车辆可以为自动行驶车、各类型的机动车等。
上述的第一车道信息所指示的第一车道可以为车辆实际行驶的车道,也可以是和车辆实际行驶的车道具有相对位置关系的任意车道。
本申请实施例的上述装置主要用于车辆的横向定位,包括但不局限于用于自动行驶车辆的横向定位。
需要说明的是,由于车辆可能存在压线(即车道线)的情况下,为了描述方便,在本申请实施例中,在车辆压线的情况下,只要车辆的宽度在某一车道中达到了50%,即可认为车辆处于该车道中。
无人驾驶汽车自有汽车工业以来,便一直是各国汽车产业菁英所极力挑战的热门项目之一。由于无人驾驶汽车对于危险工作环境,以至于军事上应用的无穷潜力,亦吸引各国政府的投入,无奈由于计算机运算能力与控制***设计的问题,目前自动驾驶的车辆,均局限于封闭的场所或是特定目的使用,随着技术的发展,近年来,众多汽车公司开始不约而同地展示自动驾驶的科技,甚至有一些汽车大厂更预言在2020年就可以将此技术普及。
在一些实施例中,识别单元包括:第一处理模块,用于对图像信息进行正射投影处理;提取模块,用于从经过正射投影处理后的图像信息进行特征提取;第二处理模块,用于将经过特征提取后的图像信息进行透视反投影处理。
通过上述的图像处理,可以识别出图像中的每条车道和每条车道的车道线特征(如虚实线、线宽等)。
在一些实施例中,如图8所示,本申请实施例的装置还可以包括: 第二获取单元82,用于在通过将第一车道的特征与行驶路面所在地图中的至少一个第二车道的特征进行特征匹配确定车辆在地图中的目标车道之前,获取车辆的卫星定位信息;第三获取单元84,用于根据卫星定位信息获取行驶路面所在路段的地图,其中,地图中携带有至少一个第二车道和至少一个第二车道的特征。
上述使用到的地图为高精度地图,获取行驶路面所在路段的地图可以在车载地图中定位到具体路段的地图;也可以通过在线获取的方式从互联网的高精度地图中获取行驶路面所在路段的地图。
在一些实施例中,上述的车道的特征主要包括车道线特征和车道宽度,第一确定单在通过将第一车道的特征与行驶路面所在地图中的至少一个第二车道的特征进行特征匹配确定车辆在地图中的目标车道时,可以通过如下模块实现:查找模块,用于从至少一个第二车道中查找车道线特征和车道宽度与第一车道的车道线特征和车道宽度匹配的第三车道,其中,第三车道为第一车道在地图中对应的车道;确定模块,用于确定至少一个第二车道中与第三车道具有相对位置关系的车道为车辆在地图中的目标车道,其中,相对位置关系是车辆在行驶路面上的行驶车道与第一车道之间的位置关系。
在一些实施例中,在第一车道和第二车道为多个,且多个第一车道包括车辆在行驶路面上的行驶车道的情况下,查找模块在从至少一个第二车道中查找车道线特征和车道宽度与第一车道的车道线特征和车道宽度匹配的第三车道时,在多个第二车道中查找到数量和特征与多个第一车道匹配的多个第四车道。
此时,确定模块根据车辆在行驶路面上的行驶车道在多个第一车道中的位置,在多个第四车道中确定目标车道。例如,识别了三个第一车道,行驶车道为这三个相邻的第一车道中处于中间位置的车道,确定了上述的五个车道中与这三个第一车道匹配的连续的三个车道(即第四车道)之后,即可以确定三个第四车道中处于中间位置的车 道为目标车道。
也即在图片中识别出的是多个第一车道,一般情况下这多个第一车道为相互邻接的车道,例如,为连续的三个车道,而路宽实际为五个车道,那么可以将这三个车道与五个车道进行匹配,匹配的过程中可以用这三个车道的最左侧的车道线对准五个车道的最左侧的车道线进行匹配,每次匹配完成后向右移动一条车道继续进行匹配,直至找到五个车道中与这三个车道特征完全一致的连续的三个车道;同理,也可以从最右侧的车道线开始匹配。
具体地,上述的查找模块包括:第一判断子模块,用于判断至少一个第二车道中的任一车道的车道线特征与第一车道的车道线特征是否相同;第二判断子模块,用于判断任一车道的车道宽度与第一车道的车道宽度之间的差值是否小于预设值;其中,在判断出任一车道的车道线特征与第一车道的车道线特征相同,且任一车道的车道宽度与第一车道的车道宽度之间的差值小于预设值的情况下,确定任一车道为车道线特征和车道宽度与第一车道的车道线特征和车道宽度匹配的第三车道。
The target lane has a first confidence indicating determination accuracy. The apparatus may further include: a first acquiring unit, configured to acquire, after the target lane of the vehicle in the map has been determined through feature matching, a fifth lane of the vehicle in the map obtained by a positioning sensor, the fifth lane having a second confidence indicating positioning accuracy; and a selecting unit, configured to select, from the target lane and the fifth lane, the lane whose confidence satisfies a preset condition as the actual lane of the vehicle in the map.
In the intelligent driving system of a vehicle, the positioning results of multiple sensors can be consulted and the one of higher confidence taken as the final result; controlling the vehicle according to that result improves driving safety.
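The confidence-based selection can be sketched as picking the highest-confidence result; the candidate names and scores below are invented for illustration:

```python
def select_actual_lane(candidates):
    """candidates: iterable of (lane_id, confidence) pairs, e.g. the
    target lane from feature matching and the fifth lane from a
    positioning sensor. Returns the lane id of highest confidence;
    on a tie, the earlier entry is kept."""
    return max(candidates, key=lambda pair: pair[1])[0]

actual = select_actual_lane([("lane_from_matching", 0.9),
                             ("lane_from_sensor", 0.6)])
```

A production system would typically fuse several such sources rather than select a single winner, but the preset condition described above reduces to this comparison.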
In some embodiments, besides determining the driving lane of the vehicle by the above method, the technical solution of the embodiments of the present application can also locate the lateral position of the vehicle more precisely. This may be implemented by a second determining unit of the apparatus: after the first lane information of the driving road surface is recognized from the image information, the second determining unit determines the distance between the vehicle and a lane line of the first lane according to the position of the center line of the image information along the lane direction and the position of that lane line.
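A minimal sketch of that distance computation, assuming the image center column coincides with the vehicle centerline and that a metres-per-pixel scale is known after the inverse perspective transform (both are assumptions, not stated calibration):

```python
def lateral_offset_m(image_width_px, line_column_px, metres_per_px):
    """Signed lateral distance from the vehicle to a lane line detected
    at the given image column; positive means the line lies to the
    right of the vehicle centerline."""
    return (line_column_px - image_width_px / 2.0) * metres_per_px

# A line detected 200 px right of center in a 1280 px wide image,
# at 0.01 m per pixel, lies 2 m to the right of the vehicle.
offset = lateral_offset_m(1280, 840, 0.01)
```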
Positioning is one of the key technologies for autonomous driving; a vehicle needs to locate its relative position in order to perceive its surroundings precisely. The embodiments of the present application relate to a lateral positioning method based on a high-precision map. Lateral positioning refers to positioning perpendicular to the direction of travel, for example which lane the vehicle is traveling in and its specific position relative to the lane lines. As an aid to or substitute for GPS/BDS positioning, it can effectively improve positioning accuracy when satellite signals are poor or unavailable. Given high-precision data of the current road's lane lines, including the line type of each lane line (dashed or solid) and the relative positions and widths between lane lines, the information on multiple lane lines detected and recognized in real time by a camera mounted at the front of the vehicle is matched against the known information, thereby determining which lane of the road the vehicle is traveling in and its specific position relative to the lane lines.
It should be noted here that the above modules implement the same examples and application scenarios as the corresponding steps, but are not limited to the content disclosed in Embodiment 1 above. It should also be noted that, as part of the apparatus, the above modules may run in the hardware environment shown in FIG. 1 and may be implemented in software or in hardware, where the hardware environment includes a network environment.
The present application further provides a server or terminal for implementing the above lane determination method.
FIG. 9 is a structural block diagram of a terminal according to an embodiment of the present application. As shown in FIG. 9, the terminal may include one or more processors 901 (only one is shown in the figure), a memory 903, and a transmission apparatus 905 (such as the sending apparatus in the above embodiments). As shown in FIG. 9, the terminal may further include an input/output device 907.
The memory 903 may be configured to store software programs and modules, such as the program instructions/modules corresponding to the method and apparatus in the embodiments of the present application. By running the software programs and modules stored in the memory 903, the processor 901 executes various functional applications and data processing, that is, implements the above method. The memory 903 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some embodiments, the memory 903 may further include memory remotely located relative to the processor 901, and such remote memory may be connected to the terminal through a network. Examples of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The transmission apparatus 905 is configured to receive or send data via a network, and may also be used for data transmission between the processor and the memory. Specific examples of the network include wired and wireless networks. In one example, the transmission apparatus 905 includes a network interface controller (NIC), which can be connected to other network devices and a router via a network cable so as to communicate with the Internet or a local area network. In another example, the transmission apparatus 905 is a radio frequency (RF) module, which is configured to communicate with the Internet wirelessly.
Specifically, the memory 903 is configured to store an application program.
The processor 901 may call, through the transmission apparatus 905, the application program stored in the memory 903 to perform the following steps: collecting images of the driving road surface of the vehicle to obtain image information of the driving road surface; recognizing first lane information of the driving road surface from the image information, the first lane information including a first lane indicating the lane position of the vehicle on the driving road surface and features of the first lane; and determining the target lane of the vehicle in the map by matching the features of the first lane against the features of at least one second lane in the map of the driving road surface, the at least one second lane and its features being obtained from the map, and the at least one second lane including the target lane.
The processor 901 is further configured to perform the following steps: searching the at least one second lane for a third lane whose lane-line features and lane width match those of the first lane, the third lane being the lane in the map corresponding to the first lane; and determining, as the target lane of the vehicle in the map, the lane among the at least one second lane that has a relative positional relationship with the third lane, the relative positional relationship being the positional relationship between the driving lane of the vehicle on the driving road surface and the first lane.
With the embodiments of the present application, the first lane information is recognized by performing image processing on the image information of the driving road surface, and the target lane of the vehicle in the map is then determined by matching the features of the first lane against the features of at least one second lane in the map of the driving road surface. Lateral positioning of the vehicle, determining the lane it actually occupies, can thus be achieved using only the equipment already present on current autonomous vehicles, which solves the technical problem of the high cost of accurate vehicle positioning and achieves the technical effect of accurate, low-cost positioning.
In some embodiments, for corresponding specific examples, reference may be made to the examples described in the above method embodiments; details are not repeated here.
Those of ordinary skill in the art can understand that the structure shown in FIG. 9 is merely illustrative. The terminal may be a smartphone (such as an Android or iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, or another terminal device. FIG. 9 does not limit the structure of the above electronic apparatus. For example, the terminal may include more or fewer components than shown in FIG. 9 (such as a network interface or a display apparatus), or have a configuration different from that shown in FIG. 9.
Those of ordinary skill in the art can understand that all or part of the steps of the methods in the above embodiments may be completed by instructing the relevant hardware of the terminal device through a program, which may be stored in a computer-readable storage medium. The storage medium may include a flash drive, read-only memory (ROM), random access memory (RAM), a magnetic disk, an optical disc, and the like.
The present application further provides a storage medium. In some embodiments, the storage medium may be configured to store program code for executing the method.
In some embodiments, the storage medium may be located on at least one of multiple network devices in the network shown in the above embodiments.
In some embodiments, the storage medium is configured to store program code for performing the following steps:
S11: collecting images of the driving road surface of the vehicle to obtain image information of the driving road surface;
S12: recognizing first lane information of the driving road surface from the image information, the first lane information including a first lane indicating the lane position of the vehicle on the driving road surface and features of the first lane;
S13: determining the target lane of the vehicle in the map by matching the features of the first lane against the features of at least one second lane in the map of the driving road surface, the at least one second lane and its features being obtained from the map, and the at least one second lane including the target lane.
In some embodiments, the storage medium is further configured to store program code for performing the following steps:
S21: searching the at least one second lane for a third lane whose lane-line features and lane width match those of the first lane, the third lane being the lane in the map corresponding to the first lane;
S22: determining, as the target lane of the vehicle in the map, the lane among the at least one second lane that has a relative positional relationship with the third lane, the relative positional relationship being the positional relationship between the driving lane of the vehicle on the driving road surface and the first lane.
In some embodiments, for corresponding specific examples, reference may be made to the examples described in the method embodiments; details are not repeated here.
In some embodiments, the storage medium may include, but is not limited to, various media that can store program code, such as a USB flash drive, read-only memory (ROM), random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
The serial numbers of the above embodiments of the present application are for description only and do not represent the superiority or inferiority of the embodiments.
If the integrated units in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they may be stored in the above computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application.
In the above embodiments of the present application, the description of each embodiment has its own emphasis. For parts not described in detail in one embodiment, reference may be made to the relevant descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division of units is merely a division by logical function; in actual implementation there may be other divisions, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated units may be implemented in the form of hardware or in the form of software functional units.
The above are merely preferred implementations of the present application. It should be noted that those of ordinary skill in the art can make several improvements and refinements without departing from the principles of the embodiments of the present application, and such improvements and refinements shall also fall within the protection scope of the embodiments of the present application.

Claims (24)

  1. A lane determination method, applied to a computing device, the method comprising:
    acquiring image information of the driving road surface of a vehicle, collected from the vehicle;
    recognizing, from the image information, first lane information of at least one first lane of the driving road surface, wherein the first lane information comprises the positional relationship between the driving lane in which the vehicle is located on the driving road surface and the at least one first lane, and features of the at least one first lane;
    determining a target lane of the vehicle in a map by matching the features of the at least one first lane against features of at least one second lane in the map of the driving road surface, wherein the target lane is the lane in the map corresponding to the driving lane, and the features of the at least one second lane are obtained from the map.
  2. The method according to claim 1, wherein the features of the at least one first lane are first lane-line features of the first lane, the first lane-line features comprise features of one or more first lane lines, and the positional relationship is the positional relationship between the driving lane and the first lane lines;
    determining the target lane of the vehicle in the map by matching the features of the at least one first lane against the features of at least one second lane in the map of the driving road surface comprises:
    obtaining, from the map, second lane-line features of the road where the geographic position of the vehicle is located, the second lane-line features comprising features of one or more second lane lines; and
    determining, using the second lane-line features, the first lane-line features, and the positional relationship, the target lane corresponding to the driving lane in the map.
  3. The method according to claim 2, wherein determining, using the second lane-line features, the first lane-line features, and the positional relationship, the target lane corresponding to the driving lane in the map comprises:
    determining, by comparing the first lane-line features with the second lane-line features, third lane lines among the second lane lines corresponding to the first lane lines;
    determining, according to the third lane lines and the positional relationship, the target lane corresponding to the driving lane in the map.
  4. The method according to claim 2, wherein determining, using the second lane-line features, the first lane-line features, and the positional relationship, the target lane corresponding to the driving lane in the map comprises:
    determining a lane description of the driving lane according to the first lane-line features and the positional relationship, and determining a lane description of each second lane of the at least one second lane according to the features of the second lane lines, the lane description of a lane comprising features of one or more lane lines and the positional relationship between the one or more lane lines and the lane;
    comparing the lane description of the driving lane with the lane description of each second lane of the at least one second lane, and determining the second lane conforming to the lane description of the driving lane as the target lane.
  5. The method according to claim 3, wherein the first lane-line features comprise the line type of one first lane line, and the second lane-line features comprise the line types of the one or more second lane lines;
    determining, by comparing the first lane-line features with the second lane-line features, the third lane line among the second lane lines corresponding to the first lane line comprises: taking, as the third lane line, the lane line among the one or more second lane lines that has the line type of the first lane line.
  6. The method according to claim 3, wherein the first lane-line features comprise the line types and arrangement of multiple first lane lines, and the second lane-line features comprise the line types and arrangement of multiple second lane lines;
    determining, by comparing the first lane-line features with the second lane-line features, the third lane lines among the second lane lines corresponding to the first lane lines comprises: taking, as the third lane lines, the multiple lane lines among the multiple second lane lines that have the line types and arrangement of the multiple first lane lines.
  7. The method according to claim 6, wherein determining, by comparing the first lane-line features with the second lane-line features, the third lane lines among the second lane lines corresponding to the first lane lines comprises:
    organizing, according to a preset correspondence between line types and symbols, the symbols corresponding to the line types of the first lane lines in the arrangement of the first lane lines, to generate a first symbol string representing the first lane-line features;
    organizing, according to the correspondence, the symbols corresponding to the line types of the second lane lines in the arrangement of the second lane lines, to generate a second symbol string representing the second lane-line features;
    comparing the first symbol string with the second symbol string, and taking, as the third lane lines, the multiple lane lines corresponding to the symbols in the second symbol string that are identical to the first symbol string.
  8. The method according to claim 4, wherein the lane description of the driving lane comprises the line type of one first lane line and the positional relationship between the first lane line and the driving lane;
    determining the lane description of each second lane of the at least one second lane according to the features of the second lane lines comprises: determining the lane description of each second lane, the lane description comprising the line type of one second lane line and the positional relationship between the second lane line and the second lane;
    determining the second lane conforming to the lane description of the driving lane as the target lane comprises: determining, as the target lane, the lane among the at least one second lane whose lane description has a line type and positional relationship consistent with the line type and positional relationship in the lane description of the driving lane.
  9. The method according to claim 4, wherein the lane description of the driving lane comprises the line types of multiple first lane lines, a first arrangement, and the positional relationship between the multiple first lane lines and the driving lane;
    determining the lane description of each second lane of the at least one second lane according to the features of the second lane lines comprises: determining the lane description of each second lane, the lane description comprising the line types of multiple second lane lines, a second arrangement, and the positional relationship between the multiple second lane lines and the second lane;
    determining the second lane conforming to the lane description of the driving lane as the target lane comprises: determining, as the target lane, the lane among the at least one second lane whose lane description has line types, an arrangement, and a positional relationship consistent with the line types, the first arrangement, and the positional relationship in the lane description of the driving lane.
  10. The method according to claim 9, wherein determining, as the target lane, the lane among the at least one second lane whose lane description has line types, an arrangement, and a positional relationship consistent with the line types, the arrangement, and the positional relationship in the lane description of the driving lane comprises:
    organizing, according to a preset correspondence between line types and positional relationships on the one hand and symbols on the other, the symbols corresponding to the first lane lines in the lane description of the driving lane in the first arrangement, to generate a first symbol string representing the lane description of the driving lane;
    organizing, according to the correspondence, the symbols corresponding to the multiple second lane lines in the lane description of each second lane in the second arrangement, to generate a second symbol string representing the lane description of each second lane;
    comparing the first symbol string with the second symbol string of each second lane, and determining the second lane whose second symbol string matches the first symbol string as the target lane.
  11. The method according to claim 1, wherein recognizing the first lane information of the driving road surface from the image information comprises:
    determining the driving lane in the image information;
    determining the positional relationship according to the determined driving lane and the features of the first lanes.
  12. The method according to claim 11, wherein determining the driving lane in the image information comprises one of the following:
    determining, as the driving lane, the lane in the image information on which the center line of the image lies;
    determining, as the driving lane, the lane with the largest lane width in the image information;
    determining the driving lane according to the shapes of the lanes in the image information; or
    determining the driving lane according to the angles of the lane lines in the image information.
  13. The method according to claim 2, wherein obtaining, from the map, the second lane-line features of the road where the geographic position of the vehicle is located comprises:
    obtaining, from the map according to the driving direction of the vehicle, the second lane-line features of the road consistent with the driving direction.
  14. A computing device, comprising a processor and a memory, the memory storing computer-readable instructions that cause the processor to:
    acquire image information of the driving road surface of a vehicle, collected from the vehicle;
    recognize, from the image information, first lane information of at least one first lane of the driving road surface, wherein the first lane information comprises the positional relationship between the driving lane in which the vehicle is located on the driving road surface and the at least one first lane, and features of the at least one first lane;
    determine a target lane of the vehicle in a map by matching the features of the at least one first lane against features of at least one second lane of the driving road surface in the map, wherein the target lane is the lane in the map corresponding to the driving lane, and the features of the at least one second lane are obtained from the map.
  15. The computing device according to claim 14, wherein the instructions cause the processor to:
    recognize the first lane information from the image information, the first lane information comprising first lane-line features of the at least one first lane and the positional relationship between the driving lane and the first lane lines, the first lane-line features comprising features of one or more first lane lines;
    obtain, from the map, second lane-line features of the road where the geographic position of the vehicle is located, the second lane-line features comprising features of one or more second lane lines; and
    determine, using the second lane-line features, the first lane-line features, and the positional relationship, the target lane corresponding to the driving lane in the map.
  16. The computing device according to claim 15, wherein the instructions cause the processor to:
    determine, by comparing the first lane-line features with the second lane-line features, third lane lines among the second lane lines corresponding to the first lane lines;
    determine, according to the third lane lines and the positional relationship, the target lane corresponding to the driving lane in the map.
  17. The computing device according to claim 15, wherein the instructions cause the processor to:
    determine a lane description of the driving lane according to the first lane-line features and the positional relationship, and determine a lane description of each second lane of the at least one second lane according to the features of the second lane lines, the lane description of a lane comprising features of one or more lane lines and the positional relationship between the one or more lane lines and the lane;
    compare the lane description of the driving lane with the lane description of each second lane of the at least one second lane, and determine the second lane conforming to the lane description of the driving lane as the target lane.
  18. The computing device according to claim 16, wherein the instructions cause the processor to:
    when the first lane-line features comprise the line type of one first lane line and the second lane-line features comprise the line types of the one or more second lane lines, take, as the third lane line, the lane line among the one or more second lane lines that has the line type of the first lane line.
  19. The computing device according to claim 16, wherein the instructions cause the processor to:
    when the first lane-line features comprise the line types and arrangement of multiple first lane lines and the second lane-line features comprise the line types and arrangement of multiple second lane lines, take, as the third lane lines, the multiple lane lines among the multiple second lane lines that have the line types and arrangement of the multiple first lane lines.
  20. The computing device according to claim 19, wherein the instructions cause the processor to:
    organize, according to a preset correspondence between line types and symbols, the symbols corresponding to the line types of the first lane lines in the arrangement of the first lane lines, to generate a first symbol string representing the first lane-line features;
    organize, according to the correspondence, the symbols corresponding to the line types of the second lane lines in the arrangement of the second lane lines, to generate a second symbol string representing the second lane-line features;
    compare the first symbol string with the second symbol string, and take, as the third lane lines, the multiple lane lines corresponding to the symbols in the second symbol string that are identical to the first symbol string.
  21. The computing device according to claim 17, wherein the instructions cause the processor to:
    when the lane features of the driving lane comprise the line type of one first lane line and the positional relationship between the first lane line and the driving lane, determine the lane description of each second lane, the lane description comprising the line type of one second lane line and the positional relationship between the second lane line and the second lane;
    determine, as the target lane, the lane among the at least one second lane whose lane description has a line type and positional relationship consistent with the line type and positional relationship in the lane description of the driving lane.
  22. The computing device according to claim 17, wherein the instructions cause the processor to:
    when the lane features of the driving lane comprise the line types of multiple first lane lines, a first arrangement, and the positional relationship between the multiple first lane lines and the driving lane, determine the lane description of each second lane, the lane description comprising the line types of multiple second lane lines, a second arrangement, and the positional relationship between the multiple second lane lines and the second lane;
    determine, as the target lane, the lane among the at least one second lane whose lane description has line types, an arrangement, and a positional relationship consistent with the line types, the first arrangement, and the positional relationship in the lane description of the driving lane.
  23. The computing device according to claim 22, wherein the instructions cause the processor to:
    organize, according to a preset correspondence between line types and positional relationships on the one hand and symbols on the other, the symbols corresponding to the first lane lines in the lane description of the driving lane in the first arrangement, to generate a first symbol string representing the lane description of the driving lane;
    organize, according to the correspondence, the symbols corresponding to the multiple second lane lines in the lane description of each second lane in the second arrangement, to generate a second symbol string representing the lane description of each second lane;
    compare the first symbol string with the second symbol string of each second lane, and determine the second lane whose second symbol string matches the first symbol string as the target lane.
  24. A non-volatile computer-readable storage medium storing computer-readable instructions that cause at least one processor to perform the method according to any one of claims 1 to 13.
PCT/CN2018/075052 2017-02-07 2018-02-02 Lane determination method, device and storage medium WO2018145602A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2019524866A JP6843990B2 (ja) 2017-02-07 2018-02-02 Lane determination method, device and storage medium
EP18750637.3A EP3534114B1 (en) 2017-02-07 2018-02-02 Lane determination method, device and storage medium
KR1020217000555A KR102266830B1 (ko) 2017-02-07 2018-02-02 차선 결정 방법, 디바이스 및 저장 매체
KR1020197018486A KR20190090393A (ko) 2017-02-07 2018-02-02 차선 결정 방법, 디바이스 및 저장 매체
US16/439,496 US11094198B2 (en) 2017-02-07 2019-06-12 Lane determination method, device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710073556.7A CN108303103B (zh) 2017-02-07 2017-02-07 Method and apparatus for determining a target lane
CN201710073556.7 2017-02-07

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/439,496 Continuation US11094198B2 (en) 2017-02-07 2019-06-12 Lane determination method, device and storage medium

Publications (1)

Publication Number Publication Date
WO2018145602A1 true WO2018145602A1 (zh) 2018-08-16

Family

ID=62872308

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/075052 WO2018145602A1 (zh) 2018-02-02 Lane determination method, device and storage medium

Country Status (6)

Country Link
US (1) US11094198B2 (zh)
EP (1) EP3534114B1 (zh)
JP (1) JP6843990B2 (zh)
KR (2) KR102266830B1 (zh)
CN (1) CN108303103B (zh)
WO (1) WO2018145602A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020103532A1 (zh) * 2018-11-20 2020-05-28 中车株洲电力机车有限公司 一种多轴电客车自导向方法
CN112489495A (zh) * 2020-10-26 2021-03-12 浙江吉利控股集团有限公司 一种车辆预警方法、装置、电子设备及存储介质
EP3989200A4 (en) * 2019-06-19 2022-06-22 Mitsubishi Electric Corporation DEVICE, METHOD AND PROGRAM FOR DETERMINING RELATIVE POSITION
CN115265493A (zh) * 2022-09-26 2022-11-01 四川省公路规划勘察设计研究院有限公司 一种基于非标定相机的车道级定位方法及装置

Families Citing this family (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3400556A1 (en) * 2016-01-05 2018-11-14 Mobileye Vision Technologies Ltd. Systems and methods for estimating future paths
EP3428577A1 (en) * 2017-07-12 2019-01-16 Veoneer Sweden AB A driver assistance system and method
EP3457244B1 (de) * 2017-09-14 2019-10-30 Sick Ag Autonomes fahrzeug und markierungsanordnung für ein autonomes fahrzeug
US11216004B2 (en) * 2017-11-07 2022-01-04 Uatc, Llc Map automation—lane classification
CN109297500B (zh) * 2018-09-03 2020-12-15 武汉中海庭数据技术有限公司 基于车道线特征匹配的高精度定位装置及方法
CN109186615A (zh) * 2018-09-03 2019-01-11 武汉中海庭数据技术有限公司 基于高精度地图的车道边线距离检测方法、装置及存储介质
CN109186616B (zh) * 2018-09-20 2020-04-07 禾多科技(北京)有限公司 基于高精度地图和场景检索的车道线辅助定位方法
CN110969178B (zh) * 2018-09-30 2023-09-12 毫末智行科技有限公司 自动驾驶车辆的数据融合***、方法及自动驾驶***
CN110967026B (zh) * 2018-09-30 2022-02-22 毫末智行科技有限公司 车道线拟合方法及***
CN110969837B (zh) * 2018-09-30 2022-03-25 毫末智行科技有限公司 自动驾驶车辆的道路信息融合***及方法
CN111046709B (zh) * 2018-10-15 2021-02-09 广州汽车集团股份有限公司 车辆车道级定位方法、***、车辆及存储介质
EP3640679B1 (en) * 2018-10-15 2023-06-07 Zenuity AB A method for assigning ego vehicle to a lane
KR102483649B1 (ko) * 2018-10-16 2023-01-02 삼성전자주식회사 차량 위치 결정 방법 및 차량 위치 결정 장치
DE102018218043A1 (de) 2018-10-22 2020-04-23 Robert Bosch Gmbh Ermittlung einer Anzahl von Fahrspuren sowie von Spurmarkierungen auf Straßenabschnitten
CN110146096B (zh) * 2018-10-24 2021-07-20 北京初速度科技有限公司 一种基于图像感知的车辆定位方法及装置
CN109671432A (zh) * 2018-12-25 2019-04-23 斑马网络技术有限公司 语音定位处理方法、装置、定位设备及车辆
CN111380546A (zh) * 2018-12-28 2020-07-07 沈阳美行科技有限公司 基于平行道路的车辆定位方法、装置、电子设备和介质
CN111380538B (zh) * 2018-12-28 2023-01-24 沈阳美行科技股份有限公司 一种车辆定位方法、导航方法及相关装置
CN111382614B (zh) * 2018-12-28 2023-08-15 沈阳美行科技股份有限公司 车辆定位方法、装置、电子设备、计算机可读存储介质
CN109657641B (zh) * 2018-12-29 2021-02-02 北京经纬恒润科技股份有限公司 一种车辆主辅路判断方法及装置
CN110614995B (zh) * 2018-12-29 2021-01-22 长城汽车股份有限公司 车辆自动驾驶时的行车道选择方法、选择***及车辆
CN109948413B (zh) * 2018-12-29 2021-06-04 禾多科技(北京)有限公司 基于高精度地图融合的车道线检测方法
CN109870689B (zh) * 2019-01-08 2021-06-04 武汉中海庭数据技术有限公司 毫米波雷达与高精矢量地图匹配的车道级定位方法与***
CN111507129B (zh) * 2019-01-31 2023-08-22 广州汽车集团股份有限公司 车道级定位方法及***、计算机设备、车辆、存储介质
CN111507130B (zh) * 2019-01-31 2023-08-18 广州汽车集团股份有限公司 车道级定位方法及***、计算机设备、车辆、存储介质
CN109870706A (zh) * 2019-01-31 2019-06-11 深兰科技(上海)有限公司 一种路面标识的检测方法、装置、设备及介质
CN111521192A (zh) * 2019-02-01 2020-08-11 阿里巴巴集团控股有限公司 定位方法、导航信息显示方法、定位***及电子设备
CN109816981B (zh) * 2019-02-20 2021-03-19 百度在线网络技术(北京)有限公司 一种自动驾驶方法、装置及存储介质
CN111750878B (zh) * 2019-03-28 2022-06-24 北京魔门塔科技有限公司 一种车辆位姿的修正方法和装置
CN111750882B (zh) * 2019-03-29 2022-05-27 北京魔门塔科技有限公司 一种导航地图在初始化时车辆位姿的修正方法和装置
CN110954112B (zh) * 2019-03-29 2021-09-21 北京初速度科技有限公司 一种导航地图与感知图像匹配关系的更新方法和装置
CN111854727B (zh) * 2019-04-27 2022-05-13 北京魔门塔科技有限公司 一种车辆位姿的修正方法和装置
CN110174113B (zh) * 2019-04-28 2023-05-16 福瑞泰克智能***有限公司 一种车辆行驶车道的定位方法、装置及终端
CN110544375A (zh) * 2019-06-10 2019-12-06 河南北斗卫星导航平台有限公司 一种车辆监管方法、装置及计算机可读存储介质
CN110160540B (zh) * 2019-06-12 2020-12-18 禾多科技(北京)有限公司 基于高精度地图的车道线数据融合方法
CN112172810A (zh) * 2019-06-18 2021-01-05 广州汽车集团股份有限公司 车道保持装置、方法、***及汽车
CN110415541A (zh) * 2019-07-19 2019-11-05 重庆长安汽车股份有限公司 一种路口通行状态提示方法及***
CN110763246A (zh) * 2019-08-06 2020-02-07 中国第一汽车股份有限公司 自动驾驶车辆路径规划方法、装置、车辆及存储介质
CN110455298B (zh) * 2019-08-14 2022-02-08 灵动科技(北京)有限公司 车辆用定位方法及定位***
CN110516652B (zh) * 2019-08-30 2023-04-18 北京百度网讯科技有限公司 车道检测的方法、装置、电子设备及存储介质
CN112461257A (zh) * 2019-09-09 2021-03-09 华为技术有限公司 一种车道线信息的确定方法及装置
CN112578404B (zh) * 2019-09-27 2022-10-04 北京地平线机器人技术研发有限公司 一种行驶路径的确定方法及装置
CN110779535B (zh) * 2019-11-04 2023-03-03 腾讯科技(深圳)有限公司 一种获得地图数据及地图的方法、装置和存储介质
CN111044035B (zh) * 2019-12-11 2022-06-17 斑马网络技术有限公司 车辆定位方法及装置
US11860634B2 (en) * 2019-12-12 2024-01-02 Baidu Usa Llc Lane-attention: predicting vehicles' moving trajectories by learning their attention over lanes
CN113034587B (zh) * 2019-12-25 2023-06-16 沈阳美行科技股份有限公司 车辆定位方法、装置、计算机设备和存储介质
CN110967035B (zh) * 2020-02-28 2020-09-04 杭州云动智能汽车技术有限公司 一种提高车载v2x车道匹配度方法
CN111341150B (zh) * 2020-02-28 2021-01-26 长安大学 用于避免超高车辆驶入限高路段的提醒方法与装置
CN111369819B (zh) * 2020-03-02 2021-12-14 腾讯科技(深圳)有限公司 一种行驶对象的选择方法及装置
CN111524351A (zh) * 2020-04-22 2020-08-11 东风汽车集团有限公司 一种匝道限速识别方法
CN111523471B (zh) * 2020-04-23 2023-08-04 阿波罗智联(北京)科技有限公司 车辆所在车道的确定方法、装置、设备以及存储介质
CN113551680A (zh) * 2020-04-23 2021-10-26 上汽通用汽车有限公司 一种车道级定位***和方法
DE102020206221A1 (de) * 2020-05-18 2021-11-18 Zf Friedrichshafen Ag Verfahren zum Erfassen von Fahrbahnbelastungen
CN113701763B (zh) * 2020-05-20 2024-06-21 百度在线网络技术(北京)有限公司 用于生成信息的方法和装置
CN111932887B (zh) * 2020-08-17 2022-04-26 武汉四维图新科技有限公司 车道级轨迹数据的生成方法及设备
CN112216102B (zh) * 2020-09-11 2021-09-14 恒大新能源汽车投资控股集团有限公司 路面信息的确定方法、装置、设备及存储介质
CN112363192A (zh) * 2020-09-29 2021-02-12 蘑菇车联信息科技有限公司 车道定位方法、装置、车辆、电子设备及存储介质
CN112284416B (zh) * 2020-10-19 2022-07-29 武汉中海庭数据技术有限公司 一种自动驾驶定位信息校准装置、方法及存储介质
CN112541437A (zh) * 2020-12-15 2021-03-23 北京百度网讯科技有限公司 车辆定位方法、装置、电子设备及存储介质
CN112560680A (zh) * 2020-12-16 2021-03-26 北京百度网讯科技有限公司 车道线处理方法、装置、电子设备及存储介质
CN112839855B (zh) * 2020-12-31 2022-07-12 华为技术有限公司 一种轨迹预测方法与装置
US11951992B2 (en) * 2021-01-05 2024-04-09 Guangzhou Automobile Group Co., Ltd. Vehicle positioning method and apparatus, storage medium, and electronic device
CN112721926B (zh) * 2021-02-25 2023-05-09 深圳市科莱德电子有限公司 基于区块链的自动驾驶汽车车道保持控制方法、***
KR102472569B1 (ko) * 2021-03-12 2022-11-30 포티투닷 주식회사 차량의 현재 차선을 결정하기 위한 방법 및 장치
CN113269976B (zh) * 2021-03-30 2022-08-23 荣耀终端有限公司 定位方法和装置
CN113380048B (zh) * 2021-06-25 2022-09-02 中科路恒工程设计有限公司 基于神经网络的高危路段车辆驾驶行为识别方法
CN113449629B (zh) * 2021-06-25 2022-10-28 重庆卡佐科技有限公司 基于行车视频的车道线虚实识别装置、方法、设备及介质
KR102438114B1 (ko) * 2021-07-28 2022-08-31 포티투닷 주식회사 차량의 주행 경로를 결정하기 위한 방법 및 장치
CN113386771A (zh) * 2021-07-30 2021-09-14 蔚来汽车科技(安徽)有限公司 道路模型的生成方法及设备
CN113945221B (zh) * 2021-09-26 2024-02-13 华中科技大学 一种考虑近迫感效应的自动驾驶车道宽度确定方法
CN113901342B (zh) * 2021-09-30 2022-07-26 北京百度网讯科技有限公司 一种路网数据处理方法、装置、电子设备及存储介质
CN114396958B (zh) * 2022-02-28 2023-08-18 重庆长安汽车股份有限公司 基于多车道多传感器的车道定位方法、***及车辆
CN114396959B (zh) * 2022-03-25 2022-08-30 华砺智行(武汉)科技有限公司 基于高精度地图的车道匹配定位方法、装置、设备及介质
CN114820671A (zh) * 2022-04-11 2022-07-29 苏州大学 一种用于货运铁路无人驾驶的轨道限界识别方法
US11624616B1 (en) * 2022-06-20 2023-04-11 Plusai, Inc. Multi-mode visual geometry localization
US11562501B1 (en) 2022-06-20 2023-01-24 Plusai, Inc. Multi-mode visual geometry localization
CN115597593A (zh) * 2022-09-22 2023-01-13 长沙谱蓝网络科技有限公司(Cn) 基于高精地图的实时导航方法及装置
KR20240053979A (ko) * 2022-10-18 2024-04-25 네이버랩스 주식회사 그라운드 컨트롤 라인 자동 취득 방법 및 시스템
CN115352455B (zh) * 2022-10-19 2023-01-17 福思(杭州)智能科技有限公司 道路特征的预测方法和装置、存储介质及电子装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060233424A1 (en) * 2005-01-28 2006-10-19 Aisin Aw Co., Ltd. Vehicle position recognizing device and vehicle position recognizing method
CN101346602A (zh) * 2006-05-16 2009-01-14 丰田自动车株式会社 车辆用定位信息更新装置
CN105679079A (zh) * 2016-02-26 2016-06-15 同济大学 一种面向车路网协同的车载机***
CN106352867A (zh) * 2015-07-16 2017-01-25 福特全球技术公司 用于确定车辆自身位置的方法和设备

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4724043B2 (ja) * 2006-05-17 2011-07-13 トヨタ自動車株式会社 対象物認識装置
JP4680131B2 (ja) * 2006-05-29 2011-05-11 トヨタ自動車株式会社 自車位置測定装置
JP4861850B2 (ja) * 2007-02-13 2012-01-25 アイシン・エィ・ダブリュ株式会社 レーン判定装置及びレーン判定方法
CN101275854A (zh) * 2007-03-26 2008-10-01 日电(中国)有限公司 更新地图数据的方法和设备
JP4886597B2 (ja) * 2007-05-25 2012-02-29 アイシン・エィ・ダブリュ株式会社 レーン判定装置及びレーン判定方法、並びにそれを用いたナビゲーション装置
JP4506790B2 (ja) * 2007-07-05 2010-07-21 アイシン・エィ・ダブリュ株式会社 道路情報生成装置、道路情報生成方法および道路情報生成プログラム
JP4936070B2 (ja) * 2007-12-28 2012-05-23 アイシン・エィ・ダブリュ株式会社 ナビゲーション装置及びナビゲーションプログラム
JP4985431B2 (ja) * 2008-01-28 2012-07-25 アイシン・エィ・ダブリュ株式会社 道路走行予想軌跡導出装置、道路走行予想軌跡導出方法および道路走行予想軌跡導出プログラム
US8725413B2 (en) * 2012-06-29 2014-05-13 Southwest Research Institute Location and motion estimation using ground imaging sensor
KR101209062B1 (ko) * 2012-07-24 2012-12-06 주식회사 피엘케이 테크놀로지 영상인식 정보를 이용한 gps 보정 시스템 및 방법
KR101986166B1 (ko) * 2012-08-09 2019-06-07 현대모비스 주식회사 차선 인식 장치 및 방법
TWI595450B (zh) * 2014-04-01 2017-08-11 能晶科技股份有限公司 物件偵測系統
CN103954275B (zh) * 2014-04-01 2017-02-08 西安交通大学 基于车道线检测和gis地图信息开发的视觉导航方法
WO2016051228A1 (en) * 2014-09-30 2016-04-07 Umm-Al-Qura University A method and system for an accurate and energy efficient vehicle lane detection
US9459626B2 (en) * 2014-12-11 2016-10-04 Here Global B.V. Learning signs from vehicle probes
GB2542115B (en) * 2015-09-03 2017-11-15 Rail Vision Europe Ltd Rail track asset survey system
CN105260713B (zh) 2015-10-09 2019-06-28 东方网力科技股份有限公司 一种车道线检测方法和装置
CN105608429B (zh) 2015-12-21 2019-05-14 重庆大学 基于差分激励的鲁棒车道线检测方法
US9928424B2 (en) * 2016-02-22 2018-03-27 Conduent Business Services, Llc Side window detection through use of spatial probability maps
KR101851155B1 (ko) * 2016-10-12 2018-06-04 현대자동차주식회사 자율 주행 제어 장치, 그를 가지는 차량 및 그 제어 방법
US10210403B2 (en) * 2017-04-24 2019-02-19 Here Global B.V. Method and apparatus for pixel based lane prediction
US10579067B2 (en) * 2017-07-20 2020-03-03 Huawei Technologies Co., Ltd. Method and system for vehicle localization
US10210756B2 (en) * 2017-07-24 2019-02-19 Harman International Industries, Incorporated Emergency vehicle alert system
US10871377B1 (en) * 2019-08-08 2020-12-22 Phiar Technologies, Inc. Computer-vision based positioning for augmented reality navigation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3534114A4


Also Published As

Publication number Publication date
US11094198B2 (en) 2021-08-17
JP6843990B2 (ja) 2021-03-17
KR102266830B1 (ko) 2021-06-18
EP3534114A4 (en) 2020-06-24
KR20190090393A (ko) 2019-08-01
KR20210006511A (ko) 2021-01-18
JP2020518785A (ja) 2020-06-25
EP3534114A1 (en) 2019-09-04
CN108303103B (zh) 2020-02-07
US20190295420A1 (en) 2019-09-26
EP3534114B1 (en) 2021-12-29
CN108303103A (zh) 2018-07-20

Similar Documents

Publication Publication Date Title
WO2018145602A1 (zh) 车道的确定方法、装置及存储介质
CN110009913B (zh) 一种闯红灯车辆的非现场执法图片智能审核方法和***
CN110501018B (zh) 一种服务于高精度地图生产的交通标志牌信息采集方法
US10235580B2 (en) Method and system for automatic detection of parking zones
WO2018068653A1 (zh) 点云数据处理方法、装置及存储介质
CN108416808B (zh) 车辆重定位的方法及装置
CN112991791B (zh) 交通信息识别和智能行驶方法、装置、设备及存储介质
CN111830953A (zh) 车辆自定位方法、装置及***
CN110348463B (zh) 用于识别车辆的方法和装置
CN114663852A (zh) 车道线图的构建方法、装置、电子设备及可读存储介质
CN111316324A (zh) 一种自动驾驶模拟***、方法、设备及存储介质
CN110765224A (zh) 电子地图的处理方法、车辆视觉重定位的方法和车载设备
Xiao et al. Geo-spatial aerial video processing for scene understanding and object tracking
CN115564865A (zh) 一种众包高精地图的构建方法、***、电子设备及车辆
CN202350794U (zh) 一种导航数据采集装置
CN111126154A (zh) 路面元素的识别方法、装置、无人驾驶设备和存储介质
JP2012099010A (ja) 画像処理装置及び画像処理プログラム
CN115790568A (zh) 基于语义信息的地图生成方法及相关设备
Imad et al. Navigation system for autonomous vehicle: A survey
CN103903269A (zh) 球机监控视频的结构化描述方法和***
KR102608167B1 (ko) 내비게이션 표적의 마킹 방법 및 장치, 전자 기기, 컴퓨터 판독 가능 매체
CN116311095B (zh) 基于区域划分的路面检测方法、计算机设备及存储介质
CN111612854B (zh) 实景地图的生成方法、装置、计算机设备和存储介质
CN117274402B (zh) 相机外参的标定方法、装置、计算机设备及存储介质
US20240193964A1 (en) Lane line recognition method, electronic device and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18750637

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019524866

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2018750637

Country of ref document: EP

Effective date: 20190531

ENP Entry into the national phase

Ref document number: 20197018486

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE