CN113320546A - Shadow-based vehicle positioning method and control device, storage medium and vehicle - Google Patents


Info

Publication number
CN113320546A
CN113320546A
Authority
CN
China
Prior art keywords
shadow
street lamp
vehicle
determining
current vehicle
Prior art date
Legal status
Pending
Application number
CN202110748002.9A
Other languages
Chinese (zh)
Inventor
李阳
杜思军
高雷
Current Assignee
Evergrande New Energy Automobile Investment Holding Group Co Ltd
Original Assignee
Evergrande New Energy Automobile Investment Holding Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Evergrande New Energy Automobile Investment Holding Group Co Ltd
Priority to CN202110748002.9A
Publication of CN113320546A

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 — Drive control systems specially adapted for autonomous road vehicles
    • B60W50/00 — Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001 — Details of the control system
    • B60W2050/0043 — Signal treatments, identification of variables or parameters, parameter estimation or state estimation
    • B60W2554/00 — Input parameters relating to objects
    • B60W2554/20 — Static objects
    • B60W2554/80 — Spatial relation or speed relative to objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

An embodiment of the invention provides a shadow-based vehicle positioning method in the technical field of automatic driving. The method comprises the following steps: acquiring a shadow image of the road surface around the current vehicle on a lane with street lamps; acquiring the height and absolute position of a reference street lamp according to the acquired positioning information, the reference street lamp being the street lamp that causes the current vehicle to cast the shadow image; determining the relative position of the vehicle with respect to the reference street lamp from the actual shadow boundary of the acquired shadow image and the height of the reference street lamp, the actual shadow boundary being a bright-dark boundary in the shadow image; and determining the absolute position of the vehicle from the absolute position of the reference street lamp and the relative position of the vehicle with respect to it. The embodiment combines the street-lamp and shadow information that is distinctive under night road conditions (or road conditions with poor illumination) to position the vehicle with high precision and thereby assist automatic driving.

Description

Shadow-based vehicle positioning method and control device, storage medium and vehicle
Technical Field
The application relates to the technical field of automatic driving, in particular to a shadow-based vehicle positioning method, a shadow-based vehicle positioning control device, a storage medium and a vehicle.
Background
Positioning the vehicle, an important part of automatic driving, is one of its technical difficulties. Current vehicle positioning has the following solutions:
1) using GNSS to obtain absolute positioning coordinates;
2) obtaining relative positioning information with respect to lane lines, traffic signs, or other visual markers using a vision-based method;
3) fusing the positions of lane lines, traffic signs, or other visual markers recorded in a high-precision map to obtain vision-based absolute positioning coordinates;
4) calculating the current absolute positioning coordinates from the IMU, wheel-speed sensor, steering information, and the like, combined with GNSS absolute coordinates;
5) fusing the GNSS coordinates, the vision-based coordinates, and the coordinates reckoned from IMU, wheel speed, and steering with methods such as Kalman filtering to obtain the final positioning output.
The above solutions fall into two broad categories, each with certain drawbacks:
1) GNSS-based positioning generally achieves only meter-level accuracy, which cannot meet the positioning requirement of automatic driving;
2) on top of GNSS positioning, a vision-based method is generally used to improve accuracy to the level automatic driving requires, for example relative positioning against lane lines or traffic signs; but vision-based methods place high demands on lane-line quality and illumination, so performance suffers when lane lines are unclear or at night.
Disclosure of Invention
The embodiments of the invention aim to provide a shadow-based vehicle positioning method that solves the problem that existing vehicle positioning for automatic driving places high demands on lane lines and illumination, so that performance suffers when lane lines are unclear or at night.
In order to achieve the above object, an embodiment of the present invention provides a shadow-based vehicle positioning method, including: acquiring shadow images of the road around the current vehicle on a lane with a street lamp; correspondingly acquiring the height and the absolute position of a reference street lamp according to the acquired positioning information, wherein the reference street lamp is a street lamp which enables the current vehicle to form the shadow image; determining the relative position of the vehicle relative to the reference street lamp according to the actual shadow boundary of the obtained shadow image of the road surface around the vehicle and the height of the reference street lamp, wherein the actual shadow boundary is a bright-dark boundary in the shadow image; and determining the absolute positioning of the vehicle according to the absolute position of the reference street lamp and the relative position of the vehicle relative to the reference street lamp.
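By way of illustration, the four claimed steps can be sketched end to end as follows. This is a hedged sketch only: the `Lamp` record, planar (x, y) coordinates, and the `match_grid` callable (standing in for the shadow-boundary grid search detailed later) are assumptions for illustration, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Lamp:
    x: float       # absolute easting of the lamp, metres (illustrative frame)
    y: float       # absolute northing of the lamp, metres
    height: float  # lamp height above the road, metres

def locate_vehicle(shadow_image, coarse_fix, lamp_map, match_grid):
    """Sketch of the claimed pipeline.

    shadow_image : road-surface shadow image around the vehicle (step 1, given)
    coarse_fix   : rough (x, y) position from GNSS
    lamp_map     : list of Lamp records from the high-precision map (step 2)
    match_grid   : callable(shadow_image, lamp) -> (dx, dy), the vehicle's
                   position relative to the lamp (step 3; placeholder here)
    Returns the absolute (x, y) of the vehicle (step 4).
    """
    # step 2: the reference lamp is the map lamp nearest the coarse fix
    ref = min(lamp_map,
              key=lambda l: (l.x - coarse_fix[0]) ** 2 + (l.y - coarse_fix[1]) ** 2)
    # step 3: relative position from the shadow-boundary search
    dx, dy = match_grid(shadow_image, ref)
    # step 4: absolute position = lamp position + relative offset
    return ref.x + dx, ref.y + dy
```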
Optionally, the obtaining the height and the absolute position of the reference street lamp according to the positioning information of the current vehicle includes: acquiring a map containing the height and absolute position of each street lamp on a lane; according to the positioning information of the current vehicle, determining the street lamp closest to the current vehicle position in the map as the reference street lamp; and acquiring the height and the absolute position of the reference street lamp from the map.
Optionally, the determining the relative position of the current vehicle with respect to the reference street lamp includes: according to grid searching and the three-dimensional model of the current vehicle, determining corresponding reference shadow boundary lines of the current vehicle at each grid position in the action range of the reference street lamp; determining the reference shadow boundary corresponding to the actual shadow boundary, and determining the grid where the current vehicle is located according to the grid corresponding to the determined reference shadow boundary; and determining the relative position of the current vehicle relative to the reference street lamp according to the grid where the reference street lamp is located and the grid where the current vehicle is located.
Optionally, the determining the reference shadow boundary corresponding to the actual shadow boundary includes: intercepting comparison areas with preset widths at two sides of any reference shadow boundary; respectively calculating the brightness of a bright part and the brightness of a shadow part in the contrast area; and calculating a difference between the brightness of the bright portion and the brightness of the shadow portion, and determining a reference shadow boundary, in which the difference exceeds a preset threshold, as the reference shadow boundary corresponding to the actual shadow boundary.
Optionally, the preset threshold is given by the following formula:
Z = b/(r*r) - d/(r*r) = (b - d)/(r*r)
wherein b represents a bright-portion preset value, d represents a shadow-portion preset value, r represents the distance from the center of the road-surface image around the vehicle to the street lamp, and Z represents the preset threshold.
Optionally, when there are a plurality of reference street lamps, the determining the relative position of the current vehicle with respect to the reference street lamp includes: determining a group of reference shadow boundary lines corresponding to the current vehicle at each grid position in the action range of a plurality of reference street lamps according to grid search and the three-dimensional model of the current vehicle; determining the set of reference shadow boundary lines corresponding to the actual shadow boundary lines, and determining the grid where the current vehicle is located according to the grid corresponding to the set of reference shadow boundary lines; and determining the relative position of the vehicle relative to the reference street lamp according to the grid where the reference street lamp is located and the grid where the current vehicle is located.
Optionally, after obtaining the grid where the set of reference shadow boundaries exists when the vehicle is at each grid position, the shadow-based vehicle positioning method further includes: for the corresponding group of reference shadow boundaries, intercepting comparison areas with preset widths at two sides of each reference shadow boundary in each group, wherein the group of reference shadow boundaries are crossed to generate a plurality of comparison areas; respectively calculating the brightness of the bright part of each contrast area; and calculating brightness difference values of bright parts of adjacent contrast areas, respectively calculating differences between the difference values and a preset threshold, and determining that the corresponding group of reference shadow boundary lines correspond to the actual shadow boundary line if weighted average of the differences between the respectively calculated difference values and the preset threshold is smaller than the preset weighted threshold.
The embodiment of the invention also provides a shadow-based vehicle positioning control device, which comprises: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the computer program to implement the shadow based vehicle localization method of any one of the above.
Embodiments of the present invention also provide a machine-readable storage medium having instructions stored thereon, the instructions causing a machine to perform the shadow-based vehicle localization method according to any one of the above.
The embodiment of the invention also provides a vehicle which comprises the shadow-based vehicle positioning control device.
According to the above technical scheme, the relative position of the vehicle with respect to the reference street lamp is determined from the actual shadow boundary of the acquired road-surface image and the height of the reference street lamp, and the absolute position of the vehicle is determined from the absolute position of the reference street lamp and that relative position. High-precision positioning of the vehicle is thus performed by combining the distinctive street-lamp and shadow information available under night road conditions (or road conditions with poor illumination, such as overcast roads or tunnels) to assist automatic driving.
Additional features and advantages of embodiments of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the embodiments of the invention without limiting the embodiments of the invention. In the drawings:
FIG. 1 is a schematic flow chart diagram of a shadow-based vehicle localization method provided by an embodiment of the present invention;
FIG. 2 is a schematic diagram of a configuration for determining a reference position of a vehicle relative to a reference street light;
FIG. 3 is a schematic diagram of the structure of a shadow boundary generating contrast region;
FIG. 4 is a schematic diagram of the structure of multiple reference shadow boundaries.
Description of the reference numerals
11 reference street lamp 12 vehicle
13 reference shadow boundary 14 contrast area
21 reference shadow boundary 1 22 reference shadow boundary 2
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating embodiments of the invention, are given by way of illustration and explanation only, not limitation.
Fig. 1 is a schematic flow chart of a shadow-based vehicle positioning method according to an embodiment of the present invention, and referring to fig. 1, the shadow-based vehicle positioning method may include the following steps:
step S110: and acquiring a shadow image of the road surface around the current vehicle in the lane with the street lamp.
By way of example, an image of the road surface around the vehicle may be acquired by a surround-view camera, and the vehicle's shadow under street-lamp illumination extracted from that image by image-processing techniques. Since the surround-view cameras of current vehicles are generally mounted low on the vehicle body, they may capture either a complete shadow image of the road surface around the vehicle or only a partial one.
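The patent does not specify an extraction algorithm. A minimal sketch of isolating the bright-dark boundary with NumPy, assuming the surround-view images are already stitched into a top-down grayscale array and that a fixed brightness cutoff (an illustrative value, not from the disclosure) separates shadow from lit road:

```python
import numpy as np

def shadow_boundary_mask(gray, dark_thresh=60):
    """Return a boolean mask of pixels lying on the bright/dark boundary.

    gray        : 2-D uint8 array, top-down road-surface image
    dark_thresh : illustrative brightness cutoff separating shadow from lit road
    """
    shadow = gray < dark_thresh          # dark (shadow) pixels
    lit = ~shadow                        # lit road pixels
    # a boundary pixel is a shadow pixel with at least one lit 4-neighbour
    neighbour_lit = np.zeros_like(shadow)
    neighbour_lit[1:, :] |= lit[:-1, :]
    neighbour_lit[:-1, :] |= lit[1:, :]
    neighbour_lit[:, 1:] |= lit[:, :-1]
    neighbour_lit[:, :-1] |= lit[:, 1:]
    return shadow & neighbour_lit
```

The mask's True pixels form the "actual shadow boundary" as a set of pixel coordinates, matching the representation used later in the description.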
Step S120: and correspondingly acquiring the height and the absolute position of a reference street lamp according to the acquired positioning information, wherein the reference street lamp is a street lamp which enables the current vehicle to form the shadow image.
Preferably, the acquiring the height and the absolute position of the reference street lamp according to the positioning information of the current vehicle includes: acquiring a map containing the height and absolute position of each street lamp on a lane; according to the positioning information of the current vehicle, determining the street lamp closest to the current vehicle position in the map as the reference street lamp; and acquiring the height and the absolute position of the reference street lamp from the map.
By way of example, the height of the reference street lamp may be obtained from a high-precision map, and the position of the current vehicle from the high-precision map combined with vehicle positioning (e.g., GNSS or GPS). The absolute position and height of every street lamp on each road are already recorded in the high-precision map; the vehicle's positioning information is obtained from its positioning system and matched against the map to determine the vehicle's position on the map, and the absolute position of the reference street lamp is then determined from that position and the street-lamp positions in the map. Because the accuracy of GNSS and its auxiliary equipment is limited under these conditions, the vehicle position obtained here may not be exact; it only needs to indicate which street lamps the vehicle is near.
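The nearest-lamp lookup described above can be sketched as follows; the dictionary layout for map records is an assumption for illustration:

```python
import math

def nearest_lamp(vehicle_xy, lamps):
    """Pick the reference street lamp: the map lamp closest to the coarse fix.

    vehicle_xy : (x, y) coarse vehicle position from GNSS, metres
    lamps      : list of dicts with 'xy' (absolute position) and 'height'
    """
    return min(lamps, key=lambda lamp: math.dist(vehicle_xy, lamp["xy"]))
```

A production system would query only the lamps of the current road segment rather than scanning the whole map.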
Step S130: and determining the relative position of the vehicle relative to the reference street lamp according to the acquired actual shadow boundary of the shadow image of the road surface around the vehicle and the height of the reference street lamp, wherein the actual shadow boundary is a bright-dark boundary in the shadow image.
As an embodiment of the present application, determining the relative position of the vehicle with respect to the reference street lamp may include the following steps S131 to S133:
step S131: and determining a corresponding reference shadow boundary of the current vehicle at each grid position in the action range of the reference street lamp according to the grid search and the three-dimensional model of the current vehicle.
By way of example, referring to fig. 2, the grid cell containing the reference street lamp 11 can be marked on the map according to its absolute position; note that the grid is preferably defined in the vehicle coordinate system. Combining the three-dimensional model of the vehicle 12, the corresponding reference shadow boundary 13 of the vehicle at each grid position within the range of the reference street lamp 11 is simulated: for example, the image seen by one of the surround-view cameras is computed from that camera's mounting position and parameters, and the reference shadow boundary 13 is obtained by image processing. The reference shadow boundaries may be precomputed and stored in the controller's memory, or computed in real time during the search.
It should be noted that the shadow boundary in fig. 2 is drawn as a straight line for illustration only; in practice a shadow boundary may be represented by a series of pixel points (coordinates). The same applies to the shadow boundaries below and is not repeated.
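The grid search can be sketched as below. The boundary representation (lists of pixel points, per the note above) is taken from the description; the matching score (mean nearest-point distance) is an assumption for illustration, since the patent does not fix a metric:

```python
import math

def locate_grid(actual_boundary, reference_boundaries):
    """Grid search: return the grid cell whose precomputed reference shadow
    boundary is closest to the observed one.

    actual_boundary      : list of (u, v) pixel points on the observed boundary
    reference_boundaries : dict mapping grid cell -> list of (u, v) points
    """
    def score(ref):
        # illustrative metric: mean distance from each observed point
        # to its nearest point on the candidate reference boundary
        return sum(min(math.dist(p, q) for q in ref)
                   for p in actual_boundary) / len(actual_boundary)
    return min(reference_boundaries, key=lambda cell: score(reference_boundaries[cell]))
```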
Optionally, referring to fig. 3, after obtaining the grid where the corresponding reference shadow boundary 13 exists when the vehicle is at each grid position, the shadow-based vehicle positioning method may further include: for the corresponding reference shadow boundary line 13, cutting contrast areas 14 with preset widths at two sides; calculating the brightness of the bright part and the brightness of the shadow part in the contrast area 14 respectively; and calculating a difference between the brightness of the bright portion and the brightness of the shadow portion, and determining that the corresponding reference shadow boundary corresponds to the actual shadow boundary when the difference exceeds a preset threshold.
For example, contrast areas 14 of a preset width are cut on both sides of the corresponding reference shadow boundary 13, and the brightness of the bright portion and of the shadow portion of each contrast area 14 is calculated separately; for instance, the acquired road-surface image may be converted into HSB space and the mean of the B-channel values of the pixels computed within each preset-width contrast area.
Further, the difference between the brightness of the bright portion and that of the shadow portion is calculated; if it exceeds the preset threshold Z, the reference shadow boundary 13 is determined to correspond to the actual shadow boundary in the acquired road-surface image around the vehicle.
Wherein the preset threshold Z may be expressed by the following formula:
Z = b/(r*r) - d/(r*r) = (b - d)/(r*r)   (1)
wherein b denotes a bright-portion preset value and d denotes a shadow-portion preset value, both obtainable by calibration; b/(r*r) represents the brightness of the bright portion, d/(r*r) the brightness of the shadow portion, and r is the distance from the center of the road-surface image around the vehicle to the street lamp.
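As an illustration, the brightness-contrast test above may be sketched as follows. The B-channel mean stands in for brightness as the description suggests; the calibration values `b_cal`, `d_cal` and the threshold form Z = (b - d)/(r*r) are stated assumptions for illustration:

```python
import numpy as np

def boundary_matches(img_b, bright_px, shadow_px, r, b_cal=2000.0, d_cal=500.0):
    """Check whether a candidate reference boundary matches the actual one.

    img_b               : 2-D float array of HSB brightness (B channel)
    bright_px/shadow_px : index expressions selecting the two contrast strips
    r                   : distance from image centre to the street lamp (m)
    b_cal, d_cal        : calibrated bright/shadow presets (illustrative values)
    """
    diff = img_b[bright_px].mean() - img_b[shadow_px].mean()
    z = (b_cal - d_cal) / (r * r)   # threshold shrinks with lamp distance
    return diff > z
```

Note the threshold decays with the square of the lamp distance, so the same absolute contrast is accepted far from the lamp but rejected near it.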
Step S132: determining the reference shadow boundary corresponding to the actual shadow boundary, and determining the grid where the current vehicle is located from the grid associated with that reference shadow boundary.
For example, if a reference shadow boundary 13 corresponds to the actual shadow boundary in the acquired image of the road surface around the vehicle, the grid where the vehicle is located is obtained: the actual shadow boundary is extracted by image processing from an image actually captured by one of the surround-view cameras, and when it corresponds to a certain reference shadow boundary 13, the grid position of the vehicle has been found. "Corresponds" here means that the shape and position of the two boundaries are substantially consistent, allowing a small error.
Step S133: determining the relative position of the current vehicle with respect to the reference street lamp according to the grid where the reference street lamp is located and the grid where the current vehicle is located.
Preferably, when there are a plurality of reference street lamps 11, determining the relative position of the vehicle with respect to the reference street lamps may further include the following steps S231 to S233:
step S21: and determining a group of reference shadow boundary lines corresponding to the current vehicle at each grid position in the action range of the reference street lamps according to the grid search and the three-dimensional model of the current vehicle.
For example, referring to fig. 4, and similarly to step S131, the grid cells containing a plurality of reference street lamps (for example, two adjacent ones) may be marked on the map. When the vehicle is within the coverage of several reference street lamps there are several shadow boundaries, for example reference shadow boundary 1 and reference shadow boundary 2; these can be treated as a group, i.e. the two (or more) shadow boundaries together uniquely determine the grid cell in which the vehicle is located. The corresponding reference shadow boundary 1 and reference shadow boundary 2 are simulated for each grid position of the vehicle. Each group of reference shadow boundaries may be precomputed and stored in the controller's memory, or computed in real time during the search.
It should be noted that fig. 4 and this embodiment use two shadow boundaries as an example; cases with more reference street lamps and reference shadow boundaries are analogous. In fig. 4 the shadow boundaries are drawn as straight lines for illustration; in practice a shadow boundary may be represented by a series of pixel points (grid positions). The same applies below and is not repeated.
Preferably, after obtaining the grid where the set of reference shadow boundaries exist when the vehicle is at each grid position, the shadow-based vehicle positioning method further includes: for the corresponding group of reference shadow boundaries, intercepting comparison areas with preset widths at two sides of each reference shadow boundary in each group, wherein the reference shadow boundaries in one group are intersected to generate a plurality of comparison areas; respectively calculating the brightness of the bright part of each contrast area; and calculating the difference value of the brightness of the bright part of the adjacent contrast area, respectively calculating the difference value between the difference value and a preset threshold value, and if the weighted average of the difference values of the respectively calculated difference values and the preset threshold value is smaller than the preset weighted threshold value, determining that the corresponding group of reference shadow boundary lines corresponds to the actual shadow boundary line.
For example, referring to fig. 4, reference shadow boundary 1 and reference shadow boundary 2 may generate 4 contrast areas, and the brightness of each contrast area is formed by superposing the contributions of the reference street lamps. The brightness of the bright portion of each area is calculated separately; e.g. the bright-portion brightness of contrast area 1 is G1 = b/(r1*r1) + b/(r2*r2), i.e. the sum of the brightness contribution of street lamp l1 to contrast area 1 and that of street lamp l2, where r1 denotes the distance from the vehicle center to street lamp l1 and r2 the distance from the vehicle center to street lamp l2. The brightness differences Ai between the bright portions of adjacent contrast areas are then calculated, and for each Ai the difference C from the preset threshold Z is computed, the preset threshold being obtained from formula (1). The differences C are averaged with weights wi given by the brightness contributions of the reference street lamps to obtain delta_i, and if delta_i is smaller than a preset weighted threshold Zc, the corresponding group of reference shadow boundaries is determined to correspond to the actual shadow boundary.
Wherein, in each contrast area, the larger the distance ri to the street lamp, the smaller wi; for example wi = a/ri may be taken, where a is a preset value that can be set and tuned from expert experience or from historical-data calculations.
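The weighted multi-lamp check described above can be sketched as follows. The per-region threshold reuses the assumed form Z = (b - d)/(r*r); the calibration values and the weighted threshold `z_weighted` are illustrative, and each region is attributed to a single dominant lamp distance for simplicity:

```python
def group_matches(region_brightness, region_r, b_cal=2000.0, d_cal=500.0,
                  a=1.0, z_weighted=5.0):
    """Weighted check that a group of reference boundaries matches reality.

    region_brightness : bright-part brightness of each contrast region,
                        ordered so consecutive entries are adjacent regions
    region_r          : distance of each region from its dominant lamp (m)
    """
    deltas, weights = [], []
    for i in range(len(region_brightness) - 1):
        a_i = abs(region_brightness[i] - region_brightness[i + 1])   # A_i
        z_i = (b_cal - d_cal) / (region_r[i] ** 2)                   # formula (1)
        deltas.append(abs(a_i - z_i))                                # C
        weights.append(a / region_r[i])                              # w_i = a / r_i
    delta = sum(w * c for w, c in zip(weights, deltas)) / sum(weights)
    return delta < z_weighted                                        # delta_i < Zc
```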
Step S232: determining the group of reference shadow boundaries corresponding to the actual shadow boundaries, and determining the grid where the current vehicle is located from the grid associated with that group of reference shadow boundaries.
For example, similarly to step S132, if a group of reference shadow boundaries (e.g., reference shadow boundary 1 and reference shadow boundary 2) corresponds to the two actual shadow boundaries in the acquired road-surface image around the vehicle, the grid where the vehicle is located has been found.
Step S233: determining the relative position of the vehicle with respect to the reference street lamps according to the grids where the reference street lamps are located and the grid where the current vehicle is located.
Step S140: and determining the absolute positioning of the vehicle according to the absolute position of the reference street lamp and the relative position of the vehicle relative to the reference street lamp.
By way of example, given the absolute position (longitude and latitude) of the reference street lamp and the relative position of the vehicle with respect to it, the absolute position of the vehicle, i.e. its longitude and latitude, can be determined after converting between map coordinates and actual longitude and latitude.
Accordingly, the shadow-based vehicle positioning method provided by the embodiments of the invention can achieve the following technical effects:
1) by combining the distinctive street-lamp and shadow information available under night road conditions (or road conditions with poor illumination, such as overcast roads or tunnels), the vehicle is positioned with high precision to assist automatic driving;
2) under poorly illuminated road conditions, the embodiments compensate for the performance degradation of vision-based positioning caused by unclear lane lines or illumination problems, improving the availability of the automatic-driving function and the user experience.
The embodiment of the invention also provides a shadow-based vehicle positioning control device, which comprises: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the computer program to implement the shadow based vehicle positioning method of steps S110-S140.
Embodiments of the present invention also provide a machine-readable storage medium having stored thereon instructions for causing a machine to perform the shadow-based vehicle localization method according to steps S110-S140.
The embodiment of the invention also provides a vehicle which comprises the shadow-based vehicle positioning control device.
The shadow-based vehicle positioning control device, the machine-readable storage medium, and the content and effect of the vehicle provided by the embodiment of the invention are similar to those of the shadow-based vehicle positioning method provided by the embodiment of the invention, and are not repeated here.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A shadow-based vehicle positioning method, the method comprising:
acquiring positioning information of a current vehicle and a shadow image of the road surface surrounding the current vehicle on a lane with street lamps;
acquiring the height and the absolute position of a reference street lamp according to the positioning information of the current vehicle, wherein the reference street lamp is the street lamp that causes the current vehicle to form the shadow image;
identifying an actual shadow boundary in the shadow image, and determining the relative position of the current vehicle relative to the reference street lamp by combining the height of the reference street lamp, wherein the actual shadow boundary is a bright-dark boundary in the shadow image; and
determining the absolute positioning of the current vehicle according to the absolute position of the reference street lamp and the relative position of the current vehicle with respect to the reference street lamp.
2. The shadow-based vehicle positioning method of claim 1, wherein the acquiring of the height and the absolute position of the reference street lamp according to the positioning information of the current vehicle comprises:
acquiring a map containing the height and absolute position of each street lamp on a lane;
according to the positioning information of the current vehicle, determining the street lamp closest to the current vehicle position in the map as the reference street lamp;
acquiring the height and the absolute position of the reference street lamp from the map.
3. The shadow-based vehicle positioning method of claim 1, wherein the determining of the relative position of the current vehicle with respect to the reference street lamp comprises:
determining, according to a grid search and a three-dimensional model of the current vehicle, the reference shadow boundary line corresponding to the current vehicle at each grid position within the illumination range of the reference street lamp;
determining the reference shadow boundary line corresponding to the actual shadow boundary line, and determining the grid where the current vehicle is located according to the grid corresponding to the determined reference shadow boundary line; and
determining the relative position of the current vehicle with respect to the reference street lamp according to the grid where the reference street lamp is located and the grid where the current vehicle is located.
4. The shadow-based vehicle positioning method of claim 3, wherein the determining of the reference shadow boundary line corresponding to the actual shadow boundary line comprises:
extracting comparison areas of preset width on both sides of each reference shadow boundary line;
respectively calculating the brightness of the bright portion and the brightness of the shadow portion in the comparison area; and
calculating the difference between the brightness of the bright portion and the brightness of the shadow portion, and determining the reference shadow boundary line whose difference exceeds a preset threshold as the reference shadow boundary line corresponding to the actual shadow boundary line.
5. The shadow-based vehicle positioning method of claim 4, wherein the preset threshold is represented by:
Figure FDA0003145033280000021
wherein b represents a preset value for the bright portion, d represents a preset value for the shadow portion, r represents the distance from the center of the road surface image around the vehicle to the street lamp, and Z represents the preset threshold.
6. The shadow-based vehicle positioning method of claim 1, wherein, when there are a plurality of reference street lamps, the determining of the relative position of the current vehicle with respect to the reference street lamps comprises:
determining, according to a grid search and the three-dimensional model of the current vehicle, the group of reference shadow boundary lines corresponding to the current vehicle at each grid position within the combined illumination range of the plurality of reference street lamps;
determining the group of reference shadow boundary lines corresponding to the actual shadow boundary lines, and determining the grid where the current vehicle is located according to the grid corresponding to that group of reference shadow boundary lines; and
determining the relative position of the vehicle with respect to the reference street lamps according to the grids where the reference street lamps are located and the grid where the current vehicle is located.
7. The shadow-based vehicle positioning method of claim 6, wherein, after the group of reference shadow boundary lines corresponding to the vehicle at each grid position is obtained, the method further comprises:
for the corresponding group of reference shadow boundary lines, extracting comparison areas of preset width on both sides of each reference shadow boundary line in the group, wherein the reference shadow boundary lines in the group intersect to generate a plurality of comparison areas;
respectively calculating the brightness of the bright portion of each comparison area; and
calculating the brightness difference values between the bright portions of adjacent comparison areas, calculating the difference between each such difference value and the preset threshold, and determining that the corresponding group of reference shadow boundary lines corresponds to the actual shadow boundary line if the weighted average of those differences is smaller than a preset weighted threshold.
8. A shadow-based vehicle positioning control device, the control device comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor executing the computer program to implement the shadow-based vehicle positioning method of any one of claims 1-7.
9. A machine-readable storage medium having instructions stored thereon for causing a machine to perform the shadow-based vehicle positioning method according to any one of claims 1-7.
10. A vehicle comprising the shadow-based vehicle positioning control device of claim 8.
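The brightness test described in claim 4 can be sketched as follows. This is an illustration under assumed inputs (grayscale patches already sampled from the comparison areas on each side of a candidate reference shadow boundary line); all names are hypothetical, not from the patent:

```python
import numpy as np

def matches_actual_boundary(bright_patch, shadow_patch, threshold):
    """Accept a candidate reference shadow boundary line when the mean
    brightness of the bright-side comparison area exceeds the mean
    brightness of the shadow-side comparison area by more than the
    preset threshold."""
    diff = float(np.mean(bright_patch)) - float(np.mean(shadow_patch))
    return diff > threshold

# Synthetic 8-bit-scale patches: bright side around 180, shadow side around 60.
rng = np.random.default_rng(0)
bright = rng.normal(180.0, 5.0, size=(20, 5))
shadow = rng.normal(60.0, 5.0, size=(20, 5))
accepted = matches_actual_boundary(bright, shadow, threshold=50.0)
```

In a real pipeline, the two comparison areas of preset width would be cut from the camera image on either side of each candidate boundary line produced by the grid search, and the candidate whose brightness difference exceeds the preset threshold Z of claim 5 would fix the grid where the vehicle is located.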
CN202110748002.9A 2021-07-02 2021-07-02 Shadow-based vehicle positioning method and control device, storage medium and vehicle Pending CN113320546A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110748002.9A CN113320546A (en) 2021-07-02 2021-07-02 Shadow-based vehicle positioning method and control device, storage medium and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110748002.9A CN113320546A (en) 2021-07-02 2021-07-02 Shadow-based vehicle positioning method and control device, storage medium and vehicle

Publications (1)

Publication Number Publication Date
CN113320546A true CN113320546A (en) 2021-08-31

Family

ID=77425459

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110748002.9A Pending CN113320546A (en) 2021-07-02 2021-07-02 Shadow-based vehicle positioning method and control device, storage medium and vehicle

Country Status (1)

Country Link
CN (1) CN113320546A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005056647A1 (en) * 2004-11-30 2006-06-01 Honda Motor Co., Ltd. Monitoring device for inspecting a motor vehicle's surroundings uses a picture taken by an infrared imaging-device to extract a body in the vehicle's surroundings as an object
DE102006037993A1 (en) * 2005-08-18 2007-04-26 GM Global Technology Operations, Inc., Detroit System and method for detecting a collision and predicting a vehicle path
CN102348306A (en) * 2010-08-03 2012-02-08 鸿富锦精密工业(深圳)有限公司 Intelligent street lamp control system and intelligent street lamp control method
CN104915642A (en) * 2015-05-26 2015-09-16 奇瑞汽车股份有限公司 Method and apparatus for measurement of distance to vehicle ahead
WO2018077263A1 (en) * 2016-10-31 2018-05-03 张舒怡 Sensor for automatic driving
CN109447003A (en) * 2018-10-31 2019-03-08 百度在线网络技术(北京)有限公司 Vehicle checking method, device, equipment and medium
CN110794844A (en) * 2019-11-18 2020-02-14 北京百度网讯科技有限公司 Automatic driving method, device, electronic equipment and readable storage medium


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116994354A (en) * 2023-09-28 2023-11-03 上海闪马智能科技有限公司 Road electric facility inspection method and device, storage medium and electronic device
CN116994354B (en) * 2023-09-28 2024-01-23 上海闪马智能科技有限公司 Road electric facility inspection method and device, storage medium and electronic device

Similar Documents

Publication Publication Date Title
JP6899368B2 (en) Methods and systems for generating and using localization reference data
KR20190094405A (en) Method and system for video-based positioning and mapping
JP2020500290A (en) Method and system for generating and using location reference data
WO2018113451A1 (en) Map data system, method for generating and using same, and application thereof
CN111830953A (en) Vehicle self-positioning method, device and system
CN111652060A (en) Laser radar-based height-limiting early warning method and device, electronic equipment and storage medium
CN109472864B (en) Generation method and device of elevation tile map
CN115235493B (en) Method and device for automatic driving positioning based on vector map
CN114067288A (en) Traffic sign extraction method and system, electronic equipment and storage medium
CN115493602A (en) Semantic map construction method and device, electronic equipment and storage medium
CN112446915B (en) Picture construction method and device based on image group
CN112749584B (en) Vehicle positioning method based on image detection and vehicle-mounted terminal
CN113320546A (en) Shadow-based vehicle positioning method and control device, storage medium and vehicle
CN111427331B (en) Perception information display method and device of unmanned vehicle and electronic equipment
CN114659513B (en) Unstructured road-oriented point cloud map construction and maintenance method
CN113535863B (en) Moving track rendering method and device and storage medium
CN114677458A (en) Road mark generation method and device for high-precision map, electronic equipment and storage medium
CN114643984A (en) Driving risk avoiding method, device, equipment, medium and product
CN112530270B (en) Mapping method and device based on region allocation
CN112556703A (en) Method, device and system for updating high-precision map
CN115187762B (en) Vehicle map rendering method and device, vehicle and storage medium
CN112991434B (en) Method for generating automatic driving traffic identification information and related device
CN116958915B (en) Target detection method, target detection device, electronic equipment and storage medium
US11294385B2 (en) System and method for generating a representation of an environment
CN116152355A (en) Road side camera calibration method and device and target detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210831