CN116295490A - Vehicle positioning method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN116295490A
Authority
CN
China
Prior art keywords: vehicle, sides, distance, lane, positioning
Legal status
Pending
Application number
CN202310264672.2A
Other languages
Chinese (zh)
Inventor
李岩 (Li Yan)
费再慧 (Fei Zaihui)
张海强 (Zhang Haiqiang)
Current Assignee
Zhidao Network Technology Beijing Co Ltd
Original Assignee
Zhidao Network Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhidao Network Technology Beijing Co Ltd filed Critical Zhidao Network Technology Beijing Co Ltd
Priority to CN202310264672.2A
Publication of CN116295490A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G01C21/04 Navigation; Navigational instruments by terrestrial means
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The application discloses a vehicle positioning method and device, electronic equipment and a storage medium. The method comprises: collecting image information below the two sides of a vehicle; obtaining, from the image information, the distance between the vehicle and the lane lines on both sides; and, based on the vehicle positioning position, determining the actual position and heading of the vehicle from the distances to the lane lines on both sides, the width of the vehicle, and the lane width in a high-precision map of the vehicle's current position. The method and device provide more accurate positioning information and overcome existing positioning limitations. The visual positioning result can also be used to judge, or give feedback on, the validity of an autonomous-driving operation route.

Description

Vehicle positioning method and device, electronic equipment and storage medium
Technical Field
The application relates to the technical field of automatic driving, in particular to a vehicle positioning method and device, electronic equipment and a storage medium.
Background
Autonomous driving technology is maturing, and vehicle types such as robotaxis, robobuses, unmanned delivery vehicles and street sweepers are being deployed at an accelerating pace. Although different vehicle types differ in operating environment and speed, the downstream perception, planning and control modules all require centimetre-level accuracy, with guaranteed stability, from the position and attitude provided by the upstream positioning module.
Existing positioning schemes adopt multi-sensor fusion, in which filtering- or optimization-based methods are used to solve the positioning drift that occurs when GNSS signals are interfered with, so as to achieve centimetre-level positioning in all scenarios.
Although multi-sensor fusion can largely solve positioning drift under GNSS interference, problems remain, such as positioning limitations and unreasonable planning of the related operation routes.
Disclosure of Invention
The embodiments of the application provide a vehicle positioning method and device, electronic equipment and a storage medium, so as to overcome positioning limitations.
The embodiment of the application adopts the following technical scheme:
in a first aspect, an embodiment of the present application provides a vehicle positioning method, where the method includes:
collecting image information below two sides of a vehicle;
according to the image information, obtaining the distance between the vehicle and the lane lines on two sides;
based on the vehicle positioning position, determining the actual position and the direction of the vehicle according to the distance between the vehicle and the lane lines on two sides, the width of the vehicle and the lane width in a high-precision map of the current position of the vehicle.
In some embodiments, the determining the actual position and orientation of the vehicle comprises:
projecting lanes in the high-precision map of the vehicle positioning position and the current position of the vehicle to a 2D plane;
projecting the distance between the vehicle and lane lines on two sides to the 2D plane for matching to obtain a transverse offset and a course angle offset;
and determining the actual position and the orientation of the vehicle according to the lateral offset and the course angle offset.
In some embodiments, the method further comprises:
judging whether the sum of the distance between the vehicle and the lane lines on two sides and the width of the vehicle meets the error condition between the sum and the lane width in a high-precision map of the current position of the vehicle;
and if the error condition is met, correcting the distance between the vehicle and the lane lines at the two sides.
In some embodiments, the method further comprises:
and under the condition that a lane line in the current driving road area of the vehicle is missing or unavailable, taking the image information on the two sides of the vehicle together with vehicle data as a visual odometer, and determining the current actual position and orientation of the vehicle according to the visual odometer, wherein the vehicle data comprises at least one of the following: vehicle body speed, and heading-angle change rate (yaw rate) information.
In some embodiments, before the capturing the image information under both sides of the vehicle, the method further includes:
at least two cameras, located below the left and right sides of the vehicle, are deployed on the vehicle and calibrated; the cameras face downward, and their perception range covers the distance range from the vehicle body or the wheels to the lane lines on the two sides.
In some embodiments, after determining the actual position and orientation of the vehicle based on the vehicle positioning position according to the distance between the vehicle and the lane lines on two sides, the width of the vehicle and the lane width in the high-precision map of the current position of the vehicle, the method further includes:
continuously recording the distance between the vehicle and lane lines on two sides in the process that the vehicle runs according to a preset operation line, and obtaining a distance statistical result;
and judging whether the preset operation line is reasonable or not and/or whether the high-precision map data of the current road section corresponding to the preset operation line is correct or not according to the distance statistics result.
In some embodiments, determining whether the preset operation line is reasonable includes:
judging whether the distance between the vehicle and the lane lines on the two sides is continuously smaller than a positioning safety error range threshold value; if so, the line is unreasonable;
judging whether the high-precision map data of the current road section corresponding to the preset operation line is correct or not comprises:
and if the error between the sum of the distance between the vehicle and the lane lines on two sides and the width of the vehicle and the lane width in the high-precision map of the current position of the vehicle exceeds a preset threshold value, updating the high-precision map data of the current road section.
In a second aspect, embodiments of the present application further provide a vehicle positioning device, where the device includes:
the image acquisition module is used for acquiring image information below two sides of the vehicle;
the distance acquisition module is used for acquiring the distance between the vehicle and the lane lines on two sides according to the image information;
and the positioning determining module is used for determining the actual position and the orientation of the vehicle based on the vehicle positioning position according to the distance between the vehicle and the lane lines on two sides, the width of the vehicle and the lane width in the high-precision map of the current position of the vehicle.
In a third aspect, embodiments of the present application further provide an electronic device, including: a processor; and a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the above method.
In a fourth aspect, embodiments of the present application also provide a computer-readable storage medium storing one or more programs that, when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the above-described method.
The technical scheme adopted by the embodiments of the application can achieve the following beneficial effects. Image information below the two sides of the vehicle is collected, and the distance between the vehicle and the lane lines on both sides is obtained from it. Based on the vehicle positioning position, the actual position and heading of the vehicle are then determined from the distances to the lane lines on both sides, the width of the vehicle, and the lane width in a high-precision map of the vehicle's current position. The lateral position of the vehicle is corrected using the camera-perceived distance between the vehicle body or wheels and the lane lines on both sides. Downward-facing cameras replace a forward-facing camera for lane-line detection and, combined with vehicle body speed, heading-angle change rate and high-precision map data, provide more accurate positioning information; this alleviates position drift or jumps when GNSS signals are lost and overcomes the positioning limitations of integrated navigation.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a schematic flow chart of a vehicle positioning method according to an embodiment of the present application;
FIG. 2 is a schematic view of a vehicle positioning device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
To make the purposes, technical solutions and advantages of the present application clearer, the technical solutions of the present application will be described clearly and completely below with reference to specific embodiments of the present application and the corresponding drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art from the present disclosure without creative effort fall within the scope of the present disclosure.
The following describes in detail the technical solutions provided by the embodiments of the present application with reference to the accompanying drawings.
An embodiment of the application provides a vehicle positioning method. Fig. 1 is a schematic flow chart of the vehicle positioning method in the embodiment of the application; the method includes at least the following steps S110 to S130:
in step S110, image information of the lower sides of the vehicle is acquired.
When the image information below the two sides of the vehicle is acquired, the vehicle needs to be ensured to be at any position in the current lane, and the acquisition equipment can see lane lines on the two sides of the lane. This eliminates the need for performing a back projection transform or the like for the image information to be an original image. Therefore, the original image is used for replacing the bird's eye view after BEV conversion, and the extra error caused by camera parameters is eliminated.
And step S120, acquiring the distance between the vehicle and the lane lines on two sides according to the image information.
It can be appreciated that, for the acquired image information, a conventional lane-line recognition algorithm may be used to recognize lane lines only within one or more small areas; no additional computation is required, which ensures real-time performance.
After the lane lines on both sides are identified, the distance between the outer side of the front wheels of the vehicle and the inner side of the corresponding lane line can be calculated. The rear wheels may be used instead, in which case cameras must be added near the rear wheels; however, the distance calculated at the front wheels determines the distance between the vehicle and the lane lines on both sides more accurately, so calculating the distance between the outer side of the front wheels and the lane lines on both sides is preferred.
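As an illustrative sketch of this measurement (function and parameter names are assumptions, not from the patent), the pixel positions of the wheel edge and the lane-line inner edge in the downward camera image can be converted to a metric distance once the camera is calibrated:

```python
def wheel_to_lane_distance(lane_edge_px: float, wheel_px: float,
                           meters_per_px: float) -> float:
    # Both positions are pixel columns in the downward camera image;
    # meters_per_px comes from extrinsic calibration and assumes a flat
    # road surface roughly parallel to the image plane.
    return abs(lane_edge_px - wheel_px) * meters_per_px
```

The flat-ground assumption is what the small, downward-facing field of view buys: no homography or BEV warp is needed, only a per-camera scale factor.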
Step S130, based on the vehicle positioning position, determining the actual position and the direction of the vehicle according to the distance between the vehicle and the lane lines on two sides, the width of the vehicle and the lane width in a high-precision map of the current position of the vehicle.
The positioning result denoted the "vehicle positioning position" usually contains an error and does not represent the vehicle's current actual position, so the actual position and heading of the vehicle need to be calculated from the vehicle width, the lane width, and the distance between the vehicle and the lane lines on both sides.
It will be appreciated that the actual position of the vehicle is related to the vehicle body speed, and the heading is related to the heading angle and its rate of change; both are available from the positioning and navigation equipment on the vehicle.
The distance between the vehicle body or wheels and the lane lines on both sides, perceived from the image information, corrects the lateral position of the vehicle; the raw-image perception result replaces the BEV (Bird's Eye View) conversion result of a forward-facing camera, reducing conversion errors caused by camera parameters, vehicle body bouncing, and the like.
Compared with the related art: when laser SLAM (Simultaneous Localization and Mapping) is used for positioning, lidar calibration errors, signal pollution in the scene, and scenes with weak structural features, such as highways and tunnels, cause uncertain errors and poor stability. Lidar is also expensive, so the mass-production cost is relatively high. The image-acquisition approach of the present method is low-cost and less affected by the structural features of the scene.
Also compared with the related art: visual SLAM positioning is affected by illumination, vehicle speed and the like, making it difficult to apply in high-speed scenes. A more recent approach matches perceived lane-line information against high-precision map data for lane-level positioning; however, during driving, perception errors and vehicle bouncing affect the matching and produce errors of tens of centimetres, causing the vehicle to frequently press the lane line. The image-acquisition approach of the present method is low-cost, and using the raw perception result instead of the BEV-converted result of a forward-facing camera reduces conversion errors caused by camera parameters, vehicle bouncing and the like, so the overall perception error is small.
In one embodiment of the present application, the determining the actual position and orientation of the vehicle includes: projecting lanes in the high-precision map of the vehicle positioning position and the current position of the vehicle to a 2D plane; projecting the distance between the vehicle and lane lines on two sides to the 2D plane for matching to obtain a transverse offset and a course angle offset; and determining the actual position and the orientation of the vehicle according to the lateral offset and the course angle offset.
In implementation, the two lane lines identified in the image information need to be matched against the map data in the high-precision map of the corresponding area so as to determine the actual position and heading of the vehicle. During matching, the lane lines and the lanes in the high-precision map must be projected onto the same 2D plane, and the offsets are then calculated from the matching result.
By way of example, taking the position of the vehicle (the ego vehicle) as reference, the high-precision map data of the ego vehicle's current position is extracted and projected onto a 2D plane; at the same time, the two corrected lane lines are projected onto the same 2D plane and matched, yielding a lateral offset and a heading-angle offset. These observations are input into a filter for the update step, and the updated result is taken as the positioning result for the vehicle's current position.
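A minimal sketch of this matching and update step, under simplifying assumptions (straight lane, all names illustrative): the lateral offset is the vehicle centre's deviation from the lane centre computed from the left wheel-to-line distance, the heading offset comes from the slope of the fitted line y = a*x + b, and a fixed-gain blend stands in for the patent's unspecified filter:

```python
import math

def lane_matching_offsets(d_left: float, w_vehicle: float, w_lane: float,
                          line_slope: float):
    # Lateral offset (m) of the vehicle centre from the lane centre and
    # heading offset (rad). d_left is the left wheel-to-lane-line
    # distance; line_slope is the coefficient a of the fitted line.
    lateral = (d_left + w_vehicle / 2.0) - w_lane / 2.0
    heading = math.atan(line_slope)
    return lateral, heading

def filter_update(predicted: float, observed: float, gain: float = 0.3) -> float:
    # Fixed-gain stand-in for the filter update: blend the prediction
    # with the visual observation.
    return predicted + gain * (observed - predicted)
```

A real implementation would feed both offsets into the fusion filter's measurement model rather than a scalar blend; the sketch only shows where the visual observation enters.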
In one embodiment of the present application, the method further comprises: judging whether the sum of the distance between the vehicle and the lane lines on two sides and the width of the vehicle meets the error condition between the sum and the lane width in a high-precision map of the current position of the vehicle; and if the error condition is met, correcting the distance between the vehicle and the lane lines at the two sides.
It is judged whether the sum of the distances between the vehicle and the lane lines on both sides plus the vehicle width satisfies the error condition (an empirical threshold) with respect to the lane width in the high-precision map of the vehicle's current position. If so, the distances between the vehicle and the lane lines on both sides are considered to be within the error range and can be corrected.
It should be noted that the position correction is performed only when the error condition is satisfied; otherwise the map data may be wrong, and map data verification is required.
Since the image information is collected below the two sides of the vehicle, the camera points downward and its visible range is small, so the identified lane line can be approximated as a straight line; lane-line fitting therefore uses the straight-line equation y = ax + b.
In the specific implementation, the distances L1 and L2 between the outer side of the front wheels and the inner side of the corresponding lane lines need to be corrected according to the constraints that the lane lines are parallel and the lane width is constant.
Theoretically, L1 + L2 + W_vehicle = W_lane, but this equation may not hold because the high-precision map may contain errors; if the difference between the two sides of the equation is not within a reasonable range, e.g. 10 cm-15 cm, a subsequent map-correctness verification process is required. Here W_vehicle denotes the vehicle width and W_lane the lane width in the high-precision map.
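The width constraint and correction can be sketched as follows (an assumed implementation: the patent does not specify how the residual is redistributed, so here it is split equally between L1 and L2):

```python
def check_and_correct(l1, l2, w_vehicle, w_lane, tol=0.15):
    # Check L1 + L2 + W_vehicle against W_lane. Within tol (m, the
    # 10-15 cm empirical range from the text) the residual is split
    # equally over L1 and L2; beyond tol, None signals that map
    # verification is needed instead of a position correction.
    residual = w_lane - (l1 + l2 + w_vehicle)
    if abs(residual) > tol:
        return None  # map data suspect: run the map-correctness check
    return l1 + residual / 2.0, l2 + residual / 2.0
```

An equal split is the simplest choice consistent with the parallel-lines and constant-width constraints; a weighted split by per-camera confidence would also fit the description.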
In one embodiment of the present application, the method further comprises: and under the condition that a lane line in a current driving road area of the vehicle is missing or is not available, taking image information on two sides of the vehicle and vehicle data as a visual odometer, and determining the current actual position and direction of the vehicle according to the visual odometer, wherein the vehicle data at least comprises one of the following vehicle body speed and course angle change rate Yawrate information.
In areas where lane lines are missing or unavailable, the vehicle body data can be combined with the images to serve as a visual odometer, providing additional observations for the filter; these additional observations ensure positioning stability when the number of lanes changes.
It can be understood that combining the vehicle's body data with the two image streams as a visual odometer ensures positioning stability when the number of lanes changes. In robotics and computer vision, visual odometry (VO) is the process of determining the position and attitude of a vehicle by analyzing a sequence of associated images.
Preferably, since this is a 2D visual odometer, the vehicle displacement is calculated using inter-frame matching. Alternatively, optimized open-source frameworks such as ORB-SLAM2 and VINS-Mono may be used.
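For illustration, a minimal 2D inter-frame matching step can be written as FFT-based phase correlation over consecutive downward-camera frames (a standard technique, not necessarily the one used in the patent); multiplying the recovered pixel shift by the metres-per-pixel scale gives the vehicle displacement:

```python
import numpy as np

def frame_translation(prev: np.ndarray, curr: np.ndarray):
    # Normalised cross-power spectrum; its inverse FFT peaks at the
    # integer pixel shift of curr relative to prev (circular shifts).
    f1, f2 = np.fft.fft2(prev), np.fft.fft2(curr)
    cross = np.conj(f1) * f2
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the frame into negative offsets.
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dx), int(dy)
```

Sub-pixel refinement and outlier rejection (e.g. against the yaw-rate prediction) would be needed in practice; road texture must also be distinctive enough for the correlation peak to be sharp.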
In one embodiment of the present application, before the capturing the image information under both sides of the vehicle, the method further includes: at least two cameras positioned below the left side and the right side of the vehicle are deployed and calibrated on the vehicle, the cameras face downwards, and the perception range of the cameras comprises the distance range of the vehicle body or the wheels from the lane lines on the two sides.
Before the image information below the two sides of the vehicle is collected, cameras must be installed on both sides of the vehicle: two monocular cameras, one below each of the left and right sides. The roof sensor bracket may be appropriately extended, or the cameras mounted below the rear-view mirrors, so that with the vehicle at any position within the current lane the cameras can see the lane lines on both sides of the lane.
Furthermore, the data processing should save processing time and reduce data latency. Lane-line identification can use an algorithm common in the related art, i.e. one based on gradient-change information in the image; with such a conventional lane-line recognition algorithm, only the lane lines within a small area need to be recognized, which ensures real-time performance.
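A gradient-based sketch of this small-area step (assumed details: a painted-line edge is taken at the largest horizontal intensity step in each ROI row, and the straight-line model y = ax + b from the description is fitted by least squares):

```python
import numpy as np

def lane_edge_column(row: np.ndarray, min_grad: float = 30.0):
    # Largest horizontal intensity step (asphalt -> paint) in one ROI
    # row; cast to float first so uint8 subtraction cannot wrap around.
    grad = np.diff(row.astype(np.float64))
    i = int(np.argmax(np.abs(grad)))
    return i if abs(grad[i]) >= min_grad else None

def fit_lane_line(points):
    # Least-squares fit of y = a*x + b through (x, y) edge points; in
    # the small downward ROI the lane line is treated as straight.
    xs, ys = zip(*points)
    a, b = np.polyfit(xs, ys, 1)
    return float(a), float(b)
```

Because only a narrow ROI is scanned per row, the cost per frame stays trivial, which matches the real-time argument in the text.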
In one embodiment of the present application, after determining the actual position and orientation of the vehicle based on the vehicle positioning position according to the width of the vehicle and the lane width in the high-precision map of the current position of the vehicle, the method further includes: continuously recording the distance between the vehicle and lane lines on two sides in the process that the vehicle runs according to a preset operation line, and obtaining a distance statistical result; and judging whether the preset operation line is reasonable or not and/or whether the high-precision map data of the current road section corresponding to the preset operation line is correct or not according to the distance statistics result.
In the related art, sensor errors, algorithm errors and the actual lane width all have an influence. In a narrow lane, if the positioning accuracy requirement was not previously met, the autonomous vehicle may press the line and commit a violation even when a higher-accuracy positioning result is provided, and the operation route then needs to be replaced. Meanwhile, if the error of the high-precision map in the operation area is too large, a correct result cannot be obtained during lane-line matching.
Against the unreasonable autonomous-driving operation routes of the related art, the accurate distances between the outer side of the front wheels and the inner side of the corresponding lane lines obtained by the preceding judgment, i.e. the two high-accuracy distances L1 and L2, can be used to verify the validity of the operation route and the correctness of the map data, minimizing the risks to vehicle operation caused by high-precision map updates and overly narrow lanes. The adjusted values of L1 and L2 can be recorded throughout the drive, and an offline statistical method then determines whether the current operation route is reasonable for the current vehicle type.
In an embodiment of the present application, determining whether the preset operation line is reasonable includes: judging whether the distance between the vehicle and the lane lines at two sides is continuously smaller than a positioning safety error range threshold value or not, if so, unreasonable; judging whether the high-precision map data of the current road section corresponding to the preset operation line is correct or not comprises: and if the error between the sum of the distance between the vehicle and the lane lines on two sides and the width of the vehicle and the lane width in the high-precision map of the current position of the vehicle exceeds a preset threshold value, updating the high-precision map data of the current road section.
In some embodiments, if on a certain road section the distances L1 and L2 between the outer side of the front wheels and the inner side of the corresponding lane lines, calculated from the cameras on both sides, remain continuously below the positioning-safety-error threshold, e.g. 60 cm-70 cm, the operation route needs to be replaced to ensure driving safety.
In some embodiments, if L1 + L2 + W_vehicle and W_lane differ by more than a reasonable threshold over a certain period of time, the map data needs to be verified and updated.
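The two offline checks can be sketched together (illustrative names; the 0.65 m safety threshold and 0.15 m map tolerance follow the ranges quoted in the text):

```python
def route_checks(l1_log, l2_log, w_vehicle, w_lane,
                 safety_m=0.65, map_tol=0.15):
    # Offline statistics over logged L1/L2 wheel-to-lane distances for
    # one road section. Returns (route_reasonable, map_correct).
    pairs = list(zip(l1_log, l2_log))
    # Route is unreasonable if some clearance stays below the safety
    # threshold for the whole section ("continuously smaller").
    route_reasonable = not all(min(a, b) < safety_m for a, b in pairs)
    # Map is suspect if the width-constraint error persistently exceeds
    # the tolerance.
    map_correct = not all(abs(a + b + w_vehicle - w_lane) > map_tol
                          for a, b in pairs)
    return route_reasonable, map_correct
```

In a deployment these flags would be aggregated per road section and fed back to route planning and map maintenance rather than evaluated on raw logs.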
The embodiment of the application further provides a vehicle positioning device 200, as shown in fig. 2, and a schematic structural diagram of the vehicle positioning device in the embodiment of the application is provided, where the vehicle positioning device 200 at least includes: an image acquisition module 210, a distance acquisition module 220, and a positioning determination module 230, wherein:
in one embodiment of the present application, the image acquisition module 210 is specifically configured to: image information is collected under both sides of the vehicle.
When collecting the image information below the two sides of the vehicle, it must be ensured that, with the vehicle at any position within the current lane, the acquisition equipment can see the lane lines on both sides of the lane. Because the image information is used as a raw image, no back-projection transform or similar processing is required: the raw image replaces the bird's-eye view obtained by BEV conversion, eliminating the extra error introduced by camera parameters.
In one embodiment of the present application, the distance acquiring module 220 is specifically configured to: and acquiring the distance between the vehicle and the lane lines on two sides according to the image information.
It can be appreciated that, for the acquired image information, a conventional lane-line recognition algorithm may be used to recognize lane lines only within one or more small areas; no additional computation is required, which ensures real-time performance.
After the lane lines on both sides are identified, the distance between the outer side of the front wheels of the vehicle and the inner side of the corresponding lane line can be calculated. The rear wheels may be used instead, in which case cameras must be added near the rear wheels; however, the distance calculated at the front wheels determines the distance between the vehicle and the lane lines on both sides more accurately, so calculating the distance between the outer side of the front wheels and the lane lines on both sides is preferred.
In one embodiment of the present application, the location determining module 230 is specifically configured to: based on the vehicle positioning position, determining the actual position and the direction of the vehicle according to the distance between the vehicle and the lane lines on two sides, the width of the vehicle and the lane width in a high-precision map of the current position of the vehicle.
The positioning result denoted the "vehicle positioning position" usually contains an error and does not represent the vehicle's current actual position, so the actual position and heading of the vehicle need to be calculated from the vehicle width, the lane width, and the distance between the vehicle and the lane lines on both sides.
It will be appreciated that the actual position of the vehicle is related to the vehicle body speed, and the heading is related to the heading angle and its rate of change; both are available from the positioning and navigation equipment on the vehicle.
The distance between the vehicle body or wheels and the lane lines on both sides, perceived from the image information, corrects the lateral position of the vehicle; the raw-image perception result replaces the BEV (Bird's Eye View) conversion result of a forward-facing camera, reducing conversion errors caused by camera parameters, vehicle body bouncing, and the like.
It can be understood that the above-mentioned vehicle positioning device can implement each step of the vehicle positioning method provided in the foregoing embodiment, and the relevant explanation about the vehicle positioning method is applicable to the vehicle positioning device, which is not repeated herein.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to Fig. 3, at the hardware level the electronic device includes a processor and, optionally, an internal bus, a network interface, and a memory. The memory may include volatile memory, such as random-access memory (RAM), and may further include non-volatile memory, such as at least one disk memory. Of course, the electronic device may also include hardware required for other services.
The processor, network interface, and memory may be interconnected by an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be classified into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one bidirectional arrow is shown in Fig. 3, but this does not mean there is only one bus or one type of bus.
The memory is used for storing programs. Specifically, a program may include program code, and the program code includes computer operation instructions. The memory may include volatile memory and non-volatile storage, and provides instructions and data to the processor.
The processor reads the corresponding computer program from the non-volatile memory into the volatile memory and then runs it, forming the vehicle positioning device at the logical level. The processor executes the programs stored in the memory and is specifically configured to perform the following operations:
collecting image information below two sides of a vehicle;
according to the image information, obtaining the distance between the vehicle and the lane lines on two sides;
based on the vehicle positioning position, determining the actual position and the direction of the vehicle according to the distance between the vehicle and the lane lines on two sides, the width of the vehicle and the lane width in a high-precision map of the current position of the vehicle.
The method performed by the vehicle positioning device disclosed in the embodiment shown in Fig. 1 of the present application may be applied to, or implemented by, a processor. The processor may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), and the like; it may also be a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, which can implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present application may be embodied as being executed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as random-access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or a register. The storage medium is located in the memory; the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The electronic device may also execute the method executed by the vehicle positioning device in fig. 1, and implement the functions of the vehicle positioning device in the embodiment shown in fig. 1, which is not described herein again.
The embodiments of the present application also provide a computer-readable storage medium storing one or more programs, where the one or more programs include instructions, which when executed by an electronic device that includes a plurality of application programs, enable the electronic device to perform a method performed by the vehicle positioning apparatus in the embodiment shown in fig. 1, and specifically are configured to perform:
collecting image information below two sides of a vehicle;
according to the image information, obtaining the distance between the vehicle and the lane lines on two sides;
based on the vehicle positioning position, determining the actual position and the direction of the vehicle according to the distance between the vehicle and the lane lines on two sides, the width of the vehicle and the lane width in a high-precision map of the current position of the vehicle.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random-access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.

Claims (10)

1. A vehicle positioning method, wherein the method comprises:
collecting image information below two sides of a vehicle;
according to the image information, obtaining the distance between the vehicle and the lane lines on two sides;
based on the vehicle positioning position, determining the actual position and the direction of the vehicle according to the distance between the vehicle and the lane lines on two sides, the width of the vehicle and the lane width in a high-precision map of the current position of the vehicle.
2. The method of claim 1, wherein the determining the actual position and orientation of the vehicle comprises:
projecting lanes in the high-precision map of the vehicle positioning position and the current position of the vehicle to a 2D plane;
projecting the distance between the vehicle and lane lines on two sides to the 2D plane for matching to obtain a transverse offset and a course angle offset;
and determining the actual position and the orientation of the vehicle according to the lateral offset and the course angle offset.
3. The method of claim 2, wherein the method further comprises:
judging whether the sum of the distances between the vehicle and the lane lines on the two sides plus the width of the vehicle satisfies an error condition with respect to the lane width in a high-precision map of the current position of the vehicle;
and if the error condition is met, correcting the distance between the vehicle and the lane lines at the two sides.
4. The method of claim 2, wherein the method further comprises:
and in the case that a lane line in the road area where the vehicle is currently driving is missing or unavailable, taking the image information on the two sides of the vehicle and vehicle data as a visual odometer, and determining the current actual position and orientation of the vehicle according to the visual odometer, wherein the vehicle data includes at least one of vehicle body speed and heading angle change rate (yaw rate) information.
5. The method of claim 1, wherein prior to the acquiring the image information under both sides of the vehicle, further comprising:
at least two cameras positioned below the left side and the right side of the vehicle are deployed and calibrated on the vehicle, the cameras face downwards, and the perception range of the cameras comprises the distance range of the vehicle body or the wheels from the lane lines on the two sides.
6. The method of claim 1, wherein determining the actual position and orientation of the vehicle based on the vehicle location position from the width of the vehicle and the lane width in the high-precision map of the current location of the vehicle further comprises:
continuously recording the distance between the vehicle and lane lines on two sides in the process that the vehicle runs according to a preset operation line, and obtaining a distance statistical result;
and judging whether the preset operation line is reasonable or not and/or whether the high-precision map data of the current road section corresponding to the preset operation line is correct or not according to the distance statistics result.
7. The method of claim 6, wherein:
judging whether the preset operation line is reasonable or not comprises:
judging whether the distance between the vehicle and the lane lines on the two sides is continuously smaller than a positioning safety error range threshold, and if so, determining that the preset operation line is unreasonable;
judging whether the high-precision map data of the current road section corresponding to the preset operation line is correct or not comprises:
and if the error between the sum of the distance between the vehicle and the lane lines on two sides and the width of the vehicle and the lane width in the high-precision map of the current position of the vehicle exceeds a preset threshold value, updating the high-precision map data of the current road section.
8. A vehicle positioning device, wherein the device comprises:
the image acquisition module is used for acquiring image information below two sides of the vehicle;
the distance acquisition module is used for acquiring the distance between the vehicle and the lane lines on two sides according to the image information;
and the positioning determining module is used for determining the actual position and the orientation of the vehicle based on the vehicle positioning position according to the distance between the vehicle and the lane lines on the two sides, the width of the vehicle, and the lane width in a high-precision map of the current position of the vehicle.
9. An electronic device, comprising:
a processor; and
a memory arranged to store computer executable instructions which, when executed, cause the processor to perform the method of any of claims 1 to 7.
10. A computer readable storage medium storing one or more programs, which when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the method of any of claims 1-7.
CN202310264672.2A 2023-03-17 2023-03-17 Vehicle positioning method and device, electronic equipment and storage medium Pending CN116295490A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310264672.2A CN116295490A (en) 2023-03-17 2023-03-17 Vehicle positioning method and device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116295490A true CN116295490A (en) 2023-06-23

Family

ID=86797436


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117011827A (en) * 2023-07-11 2023-11-07 禾多科技(北京)有限公司 Method, apparatus, device and computer readable medium for detecting longitudinal distance of obstacle



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination