CN116222586A - Fusion positioning method and device for automatic driving vehicle and electronic equipment - Google Patents

Fusion positioning method and device for automatic driving vehicle and electronic equipment Download PDF

Info

Publication number
CN116222586A
CN116222586A CN202310027780.8A CN202310027780A CN116222586A CN 116222586 A CN116222586 A CN 116222586A CN 202310027780 A CN202310027780 A CN 202310027780A CN 116222586 A CN116222586 A CN 116222586A
Authority
CN
China
Prior art keywords
sensor
positioning
data
error model
positioning data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310027780.8A
Other languages
Chinese (zh)
Inventor
费再慧
李岩
张海强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhidao Network Technology Beijing Co Ltd
Original Assignee
Zhidao Network Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhidao Network Technology Beijing Co Ltd filed Critical Zhidao Network Technology Beijing Co Ltd
Priority to CN202310027780.8A priority Critical patent/CN116222586A/en
Publication of CN116222586A publication Critical patent/CN116222586A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/183Compensation of inertial measurements, e.g. for temperature effects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/45Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Manufacturing & Machinery (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Navigation (AREA)

Abstract

The application discloses a fusion positioning method, a fusion positioning device and electronic equipment of an automatic driving vehicle, wherein the method comprises the following steps: acquiring first positioning data of a plurality of sensors of the autonomous vehicle; calibrating each sensor by using a preset calibration strategy according to the first positioning data of each sensor to obtain a positioning error model of each sensor; compensating the second positioning data of each sensor by using the positioning error model of each sensor to obtain compensated second positioning data of each sensor; and carrying out fusion positioning according to the compensated second positioning data of each sensor to obtain a fusion positioning result of the automatic driving vehicle. According to the fusion positioning method for the automatic driving vehicle, positioning errors of different sensors are calibrated, so that positioning data of the different sensors are compensated and then fused, and fusion positioning accuracy and stability are improved.

Description

Fusion positioning method and device for automatic driving vehicle and electronic equipment
Technical Field
The application relates to the technical field of automatic driving, in particular to a fusion positioning method and device for an automatic driving vehicle and electronic equipment.
Background
The automatic driving vehicle needs to ensure the driving safety and stability under various complex scenes, needs high-precision positioning information as support, is difficult to realize under various complex scenes by means of a single sensor, and often needs to fuse and position by means of multiple sensor information, such as fusing the positioning information of the sensors of a laser radar, a vision camera, an IMU (Inertial Measurement Unit, an inertial measurement unit) and the like.
However, when the multi-sensor information is fused, the positioning results of different sensors have errors of different degrees, and the direct fusion also leads to larger errors of the final fusion positioning result, so that the positioning accuracy and the positioning stability of the automatic driving vehicle are reduced.
Disclosure of Invention
The embodiment of the application provides a fusion positioning method and device for an automatic driving vehicle and electronic equipment, so as to improve the fusion positioning precision and positioning stability of the automatic driving vehicle.
The embodiment of the application adopts the following technical scheme:
in a first aspect, an embodiment of the present application provides a fusion positioning method for an autopilot vehicle, where the method includes:
Acquiring first positioning data of a plurality of sensors of the autonomous vehicle;
calibrating each sensor by using a preset calibration strategy according to the first positioning data of each sensor to obtain a positioning error model of each sensor;
compensating the second positioning data of each sensor by using the positioning error model of each sensor to obtain compensated second positioning data of each sensor;
and carrying out fusion positioning according to the compensated second positioning data of each sensor to obtain a fusion positioning result of the automatic driving vehicle.
Optionally, the acquiring the first positioning data of the plurality of sensors of the autonomous vehicle includes:
acquiring current state data of an automatic driving vehicle, and determining whether the automatic driving vehicle meets a preset calibration condition according to the current state data;
and under the condition that the automatic driving vehicle meets the preset calibration condition, acquiring first positioning data of a plurality of sensors of the automatic driving vehicle.
Optionally, the determining whether the automatic driving vehicle meets the preset calibration condition according to the current state data includes:
acquiring the current speed of the automatic driving vehicle, RTK positioning data and IMU original data;
Determining whether the automatic driving vehicle is in a motion state according to the current speed and/or the IMU original data, determining whether an RTK positioning state is in a fixed solution state according to the RTK positioning data, and determining whether the automatic driving vehicle is in a straight-going state according to the IMU original data;
determining that the autonomous vehicle meets the preset calibration condition when the autonomous vehicle is in a motion state, the RTK positioning state is a fixed solution state and the autonomous vehicle is in a straight-going state;
otherwise, determining that the automatic driving vehicle does not meet the preset calibration condition.
Optionally, the positioning error model includes a course angle error model, the first positioning data of each sensor includes an original course angle corresponding to each sensor, and the calibrating each sensor by using a preset calibration strategy according to the first positioning data of each sensor, so as to obtain the positioning error model of each sensor includes:
calculating a reference course angle corresponding to each sensor according to a preset distance condition;
and determining a course angle error model of each sensor according to the original course angle corresponding to each sensor and the reference course angle corresponding to each sensor.
Optionally, the positioning error model includes a speed error model, the first positioning data of each sensor includes an original speed corresponding to each sensor, and the calibrating each sensor by using a preset calibration policy according to the first positioning data of each sensor, to obtain the positioning error model of each sensor includes:
acquiring RTK positioning data corresponding to each sensor, wherein the RTK positioning data comprise RTK positioning speed;
and determining a speed error model corresponding to each sensor according to the original speed corresponding to each sensor and the corresponding RTK positioning speed.
Optionally, after calibrating each sensor by using a preset calibration strategy according to the first positioning data of each sensor to obtain a positioning error model of each sensor, the method further includes:
updating the positioning error model of each sensor according to preset updating conditions to obtain an updated positioning error model of each sensor, wherein the preset updating conditions comprise preset distance intervals and/or preset time intervals.
Optionally, the positioning error model includes a lateral position error model, the first positioning data of each sensor includes original position information corresponding to the vision sensor, and according to the first positioning data of each sensor, calibrating each sensor by using a preset calibration strategy, the obtaining the positioning error model of each sensor includes:
Acquiring RTK positioning data, wherein the RTK positioning data comprises an RTK positioning position;
determining a total position error corresponding to the vision sensor according to the original position information corresponding to the vision sensor and the RTK positioning position, wherein the total position error comprises a transverse position error and a longitudinal position error;
eliminating longitudinal position errors in the total position errors corresponding to the visual sensor by using a preset eliminating strategy to obtain transverse position errors corresponding to the visual sensor;
and determining the transverse position error model according to the transverse position error corresponding to the visual sensor.
In a second aspect, embodiments of the present application further provide a fusion positioning device for an autonomous vehicle, where the device includes:
an acquisition unit for acquiring first positioning data of a plurality of sensors of an autonomous vehicle;
the calibration unit is used for calibrating each sensor by utilizing a preset calibration strategy according to the first positioning data of each sensor to obtain a positioning error model of each sensor;
the compensation unit is used for respectively compensating the second positioning data of each sensor by utilizing the positioning error model of each sensor to obtain compensated second positioning data of each sensor;
And the fusion positioning unit is used for carrying out fusion positioning according to the compensated second positioning data of each sensor to obtain a fusion positioning result of the automatic driving vehicle.
In a third aspect, embodiments of the present application further provide an electronic device, including:
a processor; and
a memory arranged to store computer executable instructions which, when executed, cause the processor to perform any of the methods described hereinbefore.
In a fourth aspect, embodiments of the present application also provide a computer-readable storage medium storing one or more programs that, when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform any of the methods described above.
The above-mentioned at least one technical scheme that this application embodiment adopted can reach following beneficial effect: according to the fusion positioning method of the automatic driving vehicle, first positioning data of a plurality of sensors of the automatic driving vehicle are acquired; then calibrating each sensor by using a preset calibration strategy according to the first positioning data of each sensor to obtain a positioning error model of each sensor; then, compensating the second positioning data of each sensor by using the positioning error model of each sensor to obtain compensated second positioning data of each sensor; and finally, carrying out fusion positioning according to the compensated second positioning data of each sensor to obtain a fusion positioning result of the automatic driving vehicle. According to the fusion positioning method for the automatic driving vehicle, positioning errors of different sensors are calibrated, so that positioning data of the different sensors are compensated and then fused, and fusion positioning accuracy and positioning stability are improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
fig. 1 is a flow chart of a fusion positioning method of an automatic driving vehicle in an embodiment of the application;
FIG. 2 is a schematic structural diagram of a fusion positioning device for an autonomous vehicle according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
For the purposes, technical solutions and advantages of the present application, the technical solutions of the present application will be clearly and completely described below with reference to specific embodiments of the present application and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The following describes in detail the technical solutions provided by the embodiments of the present application with reference to the accompanying drawings.
The embodiment of the application provides a fusion positioning method of an automatic driving vehicle, as shown in fig. 1, and provides a flow chart of the fusion positioning method of the automatic driving vehicle in the embodiment of the application, where the method at least includes the following steps S110 to S140:
step S110, first positioning data of a plurality of sensors of an autonomous vehicle is acquired.
When fusion positioning of the autonomous vehicle is performed, first positioning data of a plurality of sensors loaded on the autonomous vehicle need to be acquired first, and the first positioning data may include, for example, a point cloud positioning result corresponding to a laser radar, a visual positioning result corresponding to a visual camera, a positioning result of a wheel speed meter, a positioning result of an IMU, and the like.
And step S120, calibrating each sensor by using a preset calibration strategy according to the first positioning data of each sensor to obtain a positioning error model of each sensor.
After the first positioning data of each sensor is obtained, the positioning error of each sensor can be calibrated by utilizing a preset calibration strategy based on the positioning result of each sensor, and because fusion positioning is required to ensure that the position used for fusion is accurate and the information such as speed, gesture and the like is also required to be accurate enough, the positioning error can comprise, for example, the position error, the speed error, the heading angle error and the like, so that a positioning error model of each sensor is obtained. The preset calibration strategy may be, for example, to use high-precision positioning information to measure errors in positioning data of the respective sensors. The positioning error model can be obtained by constructing the positioning errors generated by each sensor within a period of time or a distance so as to ensure the accuracy and the robustness of the positioning error model.
And step S130, respectively compensating the second positioning data of each sensor by using the positioning error model of each sensor to obtain compensated second positioning data of each sensor.
In the actual positioning stage, the second positioning data generated by each sensor can be respectively compensated by using the positioning error model of each sensor calibrated in the previous step, namely, the errors of the positioning data are compensated, so that the precision of the positioning result of each sensor is improved.
And step S140, fusion positioning is carried out according to the compensated second positioning data of each sensor, and a fusion positioning result of the automatic driving vehicle is obtained.
After the compensated second positioning data of each sensor is obtained, fusion processing can be performed on the compensated positioning data of the plurality of sensors by using a preset fusion positioning algorithm such as Kalman filtering or extended Kalman filtering, so as to obtain a high-precision fusion positioning result.
According to the fusion positioning method for the automatic driving vehicle, positioning errors of different sensors are calibrated, so that positioning data of the different sensors are compensated and then fused, and fusion positioning accuracy and stability are improved.
In some embodiments of the present application, the acquiring first positioning data for a plurality of sensors of an autonomous vehicle includes: acquiring current state data of an automatic driving vehicle, and determining whether the automatic driving vehicle meets a preset calibration condition according to the current state data; and under the condition that the automatic driving vehicle meets the preset calibration condition, acquiring first positioning data of a plurality of sensors of the automatic driving vehicle.
In order to ensure the accuracy of the positioning error calibration of each sensor, the calibration of the positioning error can be performed under the condition that the current state of the automatic driving vehicle meets certain calibration conditions, so that the current state data of the automatic driving vehicle can be acquired first, the current state data can reflect the current running state, positioning state and other information of the automatic driving vehicle, and further whether the automatic driving vehicle is suitable for calibration at present can be judged.
The calibration aims at determining the positioning error model of each sensor, so that the positioning error model can be used as the basis for compensating the positioning error subsequently, and the preset calibration conditions are mainly used for restraining the automatic driving vehicle to be in a better state as far as possible, so that the more accurate positioning error model can be calibrated on line. Therefore, the actual application stage of the positioning error model is not influenced by the RTK positioning state, namely even if satellite positioning signals are poor, the positioning error compensation can be carried out by using the calibrated positioning error model.
In some embodiments of the present application, the determining whether the autonomous vehicle satisfies a preset calibration condition according to the current state data includes: acquiring the current speed of the automatic driving vehicle, RTK positioning data and IMU original data; determining whether the automatic driving vehicle is in a motion state according to the current speed and/or the IMU original data, determining whether an RTK positioning state is in a fixed solution state according to the RTK positioning data, and determining whether the automatic driving vehicle is in a straight-going state according to the IMU original data; determining that the autonomous vehicle meets the preset calibration condition when the autonomous vehicle is in a motion state, the RTK positioning state is a fixed solution state and the autonomous vehicle is in a straight-going state; otherwise, determining that the automatic driving vehicle does not meet the preset calibration condition.
When judging whether the automatic driving vehicle meets the preset calibration conditions, the embodiment of the application can judge from the following aspects:
the first aspect is that the judgment of the running state of the automatic driving vehicle can be judged by acquiring the current chassis speed of the automatic driving vehicle, if the chassis speed is greater than a preset speed threshold value, the vehicle is considered to be in normal motion dynamic state, otherwise, the vehicle is possibly in a static state or just enters the motion state from the static state, and the situation that the vehicle just enters the motion state from the normal motion state is distinguished from the situation that the vehicle just enters the motion state because the error optimization logic when the vehicle just enters the motion state is different from the error optimization logic when the vehicle already formally enters the motion state which is relatively stable, the embodiment of the application is mainly aimed at the calibration of the positioning error when the vehicle already formally enters the motion state which is relatively stable, and therefore, the preset speed threshold value can be set to be a value which is greater than 0, such as 1m/s, and the motion state and the static state are distinguished by 0.
In addition, it should be noted that, in some cases, the current speed of the automatic driving vehicle may not be obtained, and the motion state may be determined by using the IMU raw data, or, of course, the current speed and the IMU raw data may be combined to determine comprehensively.
The second aspect is the judgment of the RTK positioning state of the automatic driving vehicle, and since the RTK positioning result is used as the reference for measuring the positioning errors of other sensors, the accuracy of the RTK positioning result needs to be ensured, and the accuracy of the positioning error calculation of other sensors can be ensured. The method can judge whether the current RTK positioning position is available according to whether the RTK positioning state is a fixed solution, and if the RTK positioning state is the fixed solution, the RTK positioning position can reach the accuracy of centimeter level, namely the position is accurate enough.
The third aspect is that whether the automatic driving vehicle is in a straight running state or not is judged, if the automatic driving vehicle is in a non-straight running state, for example, in the case of turning, positioning errors such as heading angle error calibration accuracy can be affected, and the referenceability of a calibration result is reduced, so that subsequent on-line calibration can be performed in the state that the vehicle is in a straight running state, thereby ensuring the accuracy of the positioning error calibration, and the straight running state can be judged according to the angular speed and the like in IMU data.
Based on the above three aspects, if it is determined that the autonomous vehicle is in a normal motion state, the RTK positioning state is a fixed solution, and the autonomous vehicle is in a straight-going state, the preset calibration condition may be considered to be satisfied, otherwise, not satisfied. The above three aspects of judgment are not limited in strict sequence, and may be synchronous judgment or asynchronous judgment.
In some embodiments of the present application, the positioning error model includes a course angle error model, the first positioning data of each sensor includes an original course angle corresponding to each sensor, and according to the first positioning data of each sensor, calibrating each sensor by using a preset calibration policy, obtaining the positioning error model of each sensor includes: calculating a reference course angle corresponding to each sensor according to a preset distance condition; and determining a course angle error model of each sensor according to the original course angle corresponding to each sensor and the reference course angle corresponding to each sensor.
In the fusion positioning stage, the fusion of attitude information such as a course angle is an important dimension, and if the attitude error is large, the attitude control and positioning of the automatic driving vehicle are greatly influenced, so that the positioning error model of the embodiment of the application can comprise the course angle error model. As the sensors such as the laser radar, the vision camera and the IMU output course angle information, the course angle error models of the sensors can be calibrated respectively.
When the course angle error model of each sensor is calibrated, the reference course angle data corresponding to each sensor can be calculated according to the defined preset distance condition, the reference course angle data mainly comes from RTK positioning results, and the specific calculation can be based on the existing course angle calculation mode, for example, the size of the RTK course angle can be calculated according to the conversion of angle radian and the Euclidean distance between two RTK positioning positions.
Because the calculation of the course angle is related to the distance between two points, the calculation of the course angle is inaccurate due to the fact that the distance is too small or too large, the embodiment of the application defines the preset distance condition, the preset distance condition characterizes how far apart the calculation of the reference course angle data is, namely, the calculation of the reference course angle data is restrained, for example, the calculation of the reference course angle data can be set to be carried out every 3-5 meters, the calculation accuracy of the reference course angle data is guaranteed, and the calculation can be further used as a reference for measuring the course angle deviation of other sensors. The preset distance condition may include, in addition to a distance interval setting, a total distance setting, for example, setting to cache RTK positioning data of an autonomous vehicle within a distance of, for example, 500 meters and calculate a plurality of reference heading angle data.
And then, the original course angle corresponding to each sensor can be respectively compared with the corresponding reference course angle, a plurality of course angle errors corresponding to each sensor are calculated, and finally, a certain fitting algorithm can be used for fitting the plurality of course angle errors corresponding to each sensor, so that a course angle error model corresponding to each sensor can be obtained, the course angle error model can represent the corresponding relation between the original course angle and the reference course angle, and can also represent the corresponding relation between the original course angle and the course angle error, and the method can be set according to actual requirements by a person skilled in the art.
And when the second positioning data of the automatic driving vehicle is acquired, the course angle in the second positioning data of each sensor can be compensated by using the calibrated course angle error model, and the course angle observed values after the compensation of each sensor are used for fusion positioning, so that the accuracy of course angle information fusion is improved.
In some embodiments of the present application, the positioning error model includes a speed error model, the first positioning data of each sensor includes an original speed corresponding to each sensor, and according to the first positioning data of each sensor, calibrating each sensor by using a preset calibration policy, the obtaining the positioning error model of each sensor includes: acquiring RTK positioning data corresponding to each sensor, wherein the RTK positioning data comprise RTK positioning speed; and determining a speed error model corresponding to each sensor according to the original speed corresponding to each sensor and the corresponding RTK positioning speed.
In the stage of fusion positioning, a dimension which is very important for the fusion of speed information is also that if the speed error is large, the positioning of the automatic driving vehicle can be caused to generate large error, so that the positioning error model of the embodiment of the application can comprise the speed error model. Since the speed information is mainly output by sensors such as a laser radar, a wheel speed meter and the like, the speed error models of the sensors can be calibrated respectively.
When the speed error models corresponding to the laser radar and the wheel speed meter are calibrated, the speed error v_s of the laser radar and the speed error v_o of the wheel speed meter can be respectively calculated by taking the speed V_ RTK contained in the RTK positioning result as a reference, then the coefficients ks and bs of the speed error model of the laser radar and the coefficients ko and bo of the speed error model of the wheel speed meter are optimized or calibrated on line through least square or LM (Levenberg-Marquardt ) algorithm and the like, and finally the obtained speed error model of the laser radar and the speed error model of the wheel speed meter can be respectively expressed as follows:
V_rtk=ks*v_s+bs
V_rtk=ko*v_o+bo
and when the second positioning data of the automatic driving vehicle is acquired, the speed error model can be utilized to compensate the speed in the second positioning data of each sensor, and the speed observed value compensated by each sensor is used for fusion positioning, so that the accuracy of speed information fusion is improved.
In some embodiments of the present application, after calibrating each sensor according to the first positioning data of each sensor by using a preset calibration policy, the method further includes: updating the positioning error model of each sensor according to preset updating conditions to obtain an updated positioning error model of each sensor, wherein the preset updating conditions comprise preset distance intervals and/or preset time intervals.
Because the positioning error model calibrated by the embodiment is constructed based on the positioning data within a certain distance or a certain period of time, the parameters of the model may also change along with the change of time, actual scenes and the like, so that the positioning error model of each sensor can be updated according to a certain updating condition to ensure the accuracy of the positioning error model of each sensor.
The update condition may be defined, for example, as that the update is performed once every preset distance, such as 10km, or once every time, such as 10min, corresponding to the periodic forgetting history information, and the positioning error model is adaptively constructed by using only the latest data.
In some embodiments of the present application, the positioning error model includes a lateral position error model, the first positioning data of each sensor includes original position information corresponding to the vision sensor, calibrating each sensor by using a preset calibration policy according to the first positioning data of each sensor, and obtaining the positioning error model of each sensor includes: acquiring RTK positioning data, wherein the RTK positioning data comprises an RTK positioning position; determining a total position error corresponding to the vision sensor according to the original position information corresponding to the vision sensor and the RTK positioning position, wherein the total position error comprises a transverse position error and a longitudinal position error; eliminating longitudinal position errors in the total position errors corresponding to the visual sensor by using a preset eliminating strategy to obtain transverse position errors corresponding to the visual sensor; and determining the transverse position error model according to the transverse position error corresponding to the visual sensor.
The positioning data of the sensor includes information such as heading angle and speed, and also includes position information, for example, position information output by a vision camera, so that besides the heading angle error model and the speed error model constructed in the foregoing embodiments, the embodiment of the application may also construct a position error model for position errors.
When a transverse position error model is constructed based on the position information output by the vision camera, because the position information output by the vision camera is an X-Y-Z coordinate value under a UTM (Universal Transverse Mercator Grid System, universal transverse ink card grid system) coordinate system, the total position error obtained after the difference between the position information and the current RTK position directly contains both the transverse position error and the longitudinal position error, and therefore, the longitudinal position error in the total position error can be removed, namely, the transverse position error is separated from the total position error, for example, the following removal strategy can be adopted to realize:
vio_dp.x()=_time_vio_data.x()-_time_pos_data.x();
vio_dp.y()=_time_vio_data.y()-_time_pos_data.y();
_vio_dp=sqrt(vio_dp.x()*vio_dp.x()+vio_dp.y()*vio_dp.y());
double pos_yaw=-atan2(vio_dp.x(),vio_dp.y())+M_PI*0.5;
_vio_dp’=abs(sin(pos_yaw-ori_yaw))*_vio_dp;//this make sure only use val lat data;
the method comprises the steps of (a) respectively representing an XY axis position coordinate output by a visual camera by_time_ vio _data.x () and_time_ vio _data.y (), respectively representing an XY axis position coordinate of an RTK by_time_pos_data.x () and_time_pos_data.y (), respectively representing a position error of the XY axis by vio _dp.x () and vio _dp.y (), respectively representing a Euclidean distance corresponding to the position error of the XY axis by_ vio _dp, a heading angle output by the visual camera by pos_yaw, an RTK heading angle by ori_yaw, and a 90-degree heading by M_PI, wherein, the position error of_ vio _dp' is a transverse position error.
After the transverse position errors are obtained, a certain fitting algorithm can be adopted to fit a plurality of transverse position errors to obtain a transverse position error model, and then when second positioning data of the vision camera are obtained, the transverse position error model can be used for compensating the transverse position in the second positioning data of the vision camera, and then the compensated transverse position observation value is used for fusion positioning, so that the accuracy of transverse position information fusion is improved.
The embodiment of the application further provides a fusion positioning device 200 of an autopilot vehicle, as shown in fig. 2, and a schematic structural diagram of the fusion positioning device of the autopilot vehicle in the embodiment of the application is provided, where the device 200 includes: an acquisition unit 210, a calibration unit 220, a compensation unit 230 and a fusion positioning unit 240, wherein:
an acquisition unit 210 for acquiring first positioning data of a plurality of sensors of the autonomous vehicle;
the calibration unit 220 is configured to calibrate each sensor according to the first positioning data of each sensor by using a preset calibration policy, so as to obtain a positioning error model of each sensor;
the compensation unit 230 is configured to compensate the second positioning data of each sensor by using the positioning error model of each sensor, so as to obtain compensated second positioning data of each sensor;
and the fusion positioning unit 240 is configured to perform fusion positioning according to the compensated second positioning data of each sensor, so as to obtain a fusion positioning result of the automatic driving vehicle.
In some embodiments of the present application, the obtaining unit 210 is specifically configured to: acquiring current state data of an automatic driving vehicle, and determining whether the automatic driving vehicle meets a preset calibration condition according to the current state data; and under the condition that the automatic driving vehicle meets the preset calibration condition, acquiring first positioning data of a plurality of sensors of the automatic driving vehicle.
In some embodiments of the present application, the obtaining unit 210 is specifically configured to: acquiring the current speed of the automatic driving vehicle, RTK positioning data and IMU original data; determining whether the automatic driving vehicle is in a motion state according to the current speed and/or the IMU original data, determining whether an RTK positioning state is in a fixed solution state according to the RTK positioning data, and determining whether the automatic driving vehicle is in a straight-going state according to the IMU original data; determining that the autonomous vehicle meets the preset calibration condition when the autonomous vehicle is in a motion state, the RTK positioning state is a fixed solution state and the autonomous vehicle is in a straight-going state; otherwise, determining that the automatic driving vehicle does not meet the preset calibration condition.
In some embodiments of the present application, the positioning error model includes a heading angle error model, the first positioning data of each sensor includes an original heading angle corresponding to each sensor, and the calibration unit 220 is specifically configured to: calculating a reference course angle corresponding to each sensor according to a preset distance condition; and determining a course angle error model of each sensor according to the original course angle corresponding to each sensor and the reference course angle corresponding to each sensor.
In some embodiments of the present application, the positioning error model includes a velocity error model, the first positioning data of each sensor includes a raw velocity corresponding to each sensor, and the calibration unit 220 is specifically configured to: acquiring RTK positioning data corresponding to each sensor, wherein the RTK positioning data comprise RTK positioning speed; and determining a speed error model corresponding to each sensor according to the original speed corresponding to each sensor and the corresponding RTK positioning speed.
In some embodiments of the present application, the apparatus further comprises: and the updating unit is used for updating the positioning error models of the sensors according to preset updating conditions to obtain updated positioning error models of the sensors, wherein the preset updating conditions comprise preset distance intervals and/or preset time intervals.
In some embodiments of the present application, the positioning error model includes a lateral position error model, the first positioning data of each sensor includes original position information corresponding to the vision sensor, and the calibration unit 220 is specifically configured to: acquiring RTK positioning data, wherein the RTK positioning data comprises an RTK positioning position; determining a total position error corresponding to the vision sensor according to the original position information corresponding to the vision sensor and the RTK positioning position, wherein the total position error comprises a transverse position error and a longitudinal position error; eliminating longitudinal position errors in the total position errors corresponding to the visual sensor by using a preset eliminating strategy to obtain transverse position errors corresponding to the visual sensor; and determining the transverse position error model according to the transverse position error corresponding to the visual sensor.
It can be understood that the above-mentioned fusion positioning device for an automatic driving vehicle can implement each step of the fusion positioning method for an automatic driving vehicle provided in the foregoing embodiment, and the relevant explanation about the fusion positioning method for an automatic driving vehicle is applicable to the fusion positioning device for an automatic driving vehicle, which is not described herein again.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to fig. 3, at the hardware level, the electronic device includes a processor, and optionally an internal bus, a network interface, and a memory. The Memory may include a Memory, such as a Random-Access Memory (RAM), and may further include a non-volatile Memory (non-volatile Memory), such as at least 1 disk Memory. Of course, the electronic device may also include hardware required for other services.
The processor, network interface, and memory may be interconnected by an internal bus, which may be an ISA (Industry Standard Architecture ) bus, a PCI (Peripheral Component Interconnect, peripheral component interconnect standard) bus, or EISA (Extended Industry Standard Architecture ) bus, among others. The buses may be classified as address buses, data buses, control buses, etc. For ease of illustration, only one bi-directional arrow is shown in FIG. 3, but not only one bus or type of bus.
And the memory is used for storing programs. In particular, the program may include program code including computer-operating instructions. The memory may include memory and non-volatile storage and provide instructions and data to the processor.
The processor reads the corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to form the fusion positioning device of the automatic driving vehicle on a logic level. The processor is used for executing the programs stored in the memory and is specifically used for executing the following operations:
acquiring first positioning data of a plurality of sensors of the autonomous vehicle;
calibrating each sensor by using a preset calibration strategy according to the first positioning data of each sensor to obtain a positioning error model of each sensor;
compensating the second positioning data of each sensor by using the positioning error model of each sensor to obtain compensated second positioning data of each sensor;
and carrying out fusion positioning according to the compensated second positioning data of each sensor to obtain a fusion positioning result of the automatic driving vehicle.
The method performed by the fusion positioning device of the autonomous vehicle disclosed in the embodiment shown in fig. 1 of the present application may be applied to a processor or implemented by the processor. The processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; but also digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components. The disclosed methods, steps, and logic blocks in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in hardware, in a decoded processor, or in a combination of hardware and software modules in a decoded processor. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor reads the information in the memory and, in combination with its hardware, performs the steps of the above method.
The electronic device may further execute the method executed by the fusion positioning device of the autopilot vehicle in fig. 1, and implement the function of the fusion positioning device of the autopilot vehicle in the embodiment shown in fig. 1, which is not described herein.
The embodiments of the present application also provide a computer-readable storage medium storing one or more programs, where the one or more programs include instructions, which when executed by an electronic device that includes a plurality of application programs, enable the electronic device to perform a method performed by a fusion positioning device for an autonomous vehicle in the embodiment shown in fig. 1, and specifically are configured to perform:
acquiring first positioning data of a plurality of sensors of the autonomous vehicle;
calibrating each sensor by using a preset calibration strategy according to the first positioning data of each sensor to obtain a positioning error model of each sensor;
compensating the second positioning data of each sensor by using the positioning error model of each sensor to obtain compensated second positioning data of each sensor;
and carrying out fusion positioning according to the compensated second positioning data of each sensor to obtain a fusion positioning result of the automatic driving vehicle.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random Access Memory (RAM) and/or nonvolatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of computer-readable media.
Computer readable media, including both non-transitory and non-transitory, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase change memory (PRAM), static Random Access Memory (SRAM), dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), read Only Memory (ROM), electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. Computer-readable media, as defined herein, does not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.

Claims (10)

1. A fusion positioning method of an autonomous vehicle, wherein the method comprises:
acquiring first positioning data of a plurality of sensors of the autonomous vehicle;
calibrating each sensor by using a preset calibration strategy according to the first positioning data of each sensor to obtain a positioning error model of each sensor;
Compensating the second positioning data of each sensor by using the positioning error model of each sensor to obtain compensated second positioning data of each sensor;
and carrying out fusion positioning according to the compensated second positioning data of each sensor to obtain a fusion positioning result of the automatic driving vehicle.
2. The method of claim 1, wherein the acquiring first positioning data for a plurality of sensors of an autonomous vehicle comprises:
acquiring current state data of an automatic driving vehicle, and determining whether the automatic driving vehicle meets a preset calibration condition according to the current state data;
and under the condition that the automatic driving vehicle meets the preset calibration condition, acquiring first positioning data of a plurality of sensors of the automatic driving vehicle.
3. The method of claim 2, wherein the determining whether the autonomous vehicle meets a preset calibration condition based on the current state data comprises:
acquiring the current speed of the automatic driving vehicle, RTK positioning data and IMU original data;
determining whether the automatic driving vehicle is in a motion state according to the current speed and/or the IMU original data, determining whether an RTK positioning state is in a fixed solution state according to the RTK positioning data, and determining whether the automatic driving vehicle is in a straight-going state according to the IMU original data;
Determining that the autonomous vehicle meets the preset calibration condition when the autonomous vehicle is in a motion state, the RTK positioning state is a fixed solution state and the autonomous vehicle is in a straight-going state;
otherwise, determining that the automatic driving vehicle does not meet the preset calibration condition.
4. The method of claim 1, wherein the positioning error model includes a heading angle error model, the first positioning data of each sensor includes a raw heading angle corresponding to each sensor, the calibrating each sensor with a preset calibration strategy according to the first positioning data of each sensor, and obtaining the positioning error model of each sensor includes:
calculating a reference course angle corresponding to each sensor according to a preset distance condition;
and determining a course angle error model of each sensor according to the original course angle corresponding to each sensor and the reference course angle corresponding to each sensor.
5. The method of claim 1, wherein the positioning error model includes a velocity error model, the first positioning data of each sensor includes a raw velocity corresponding to each sensor, and calibrating each sensor according to the first positioning data of each sensor using a preset calibration strategy to obtain the positioning error model of each sensor includes:
Acquiring RTK positioning data corresponding to each sensor, wherein the RTK positioning data comprise RTK positioning speed;
and determining a speed error model corresponding to each sensor according to the original speed corresponding to each sensor and the corresponding RTK positioning speed.
6. The method of claim 1, wherein after calibrating each sensor using a preset calibration strategy based on the first positioning data of each sensor to obtain a positioning error model of each sensor, the method further comprises:
updating the positioning error model of each sensor according to preset updating conditions to obtain an updated positioning error model of each sensor, wherein the preset updating conditions comprise preset distance intervals and/or preset time intervals.
7. The method of claim 1, wherein the positioning error model includes a lateral position error model, the first positioning data of each sensor includes original position information corresponding to the vision sensor, the calibrating each sensor with a preset calibration strategy according to the first positioning data of each sensor, and obtaining the positioning error model of each sensor includes:
acquiring RTK positioning data, wherein the RTK positioning data comprises an RTK positioning position;
Determining a total position error corresponding to the vision sensor according to the original position information corresponding to the vision sensor and the RTK positioning position, wherein the total position error comprises a transverse position error and a longitudinal position error;
eliminating longitudinal position errors in the total position errors corresponding to the visual sensor by using a preset eliminating strategy to obtain transverse position errors corresponding to the visual sensor;
and determining the transverse position error model according to the transverse position error corresponding to the visual sensor.
8. A fusion positioning device for an autonomous vehicle, wherein the device comprises:
an acquisition unit for acquiring first positioning data of a plurality of sensors of an autonomous vehicle;
the calibration unit is used for calibrating each sensor by utilizing a preset calibration strategy according to the first positioning data of each sensor to obtain a positioning error model of each sensor;
the compensation unit is used for respectively compensating the second positioning data of each sensor by utilizing the positioning error model of each sensor to obtain compensated second positioning data of each sensor;
and the fusion positioning unit is used for carrying out fusion positioning according to the compensated second positioning data of each sensor to obtain a fusion positioning result of the automatic driving vehicle.
9. An electronic device, comprising:
a processor; and
a memory arranged to store computer executable instructions which, when executed, cause the processor to perform the method of any of claims 1 to 7.
10. A computer readable storage medium storing one or more programs, which when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the method of any of claims 1-7.
CN202310027780.8A 2023-01-09 2023-01-09 Fusion positioning method and device for automatic driving vehicle and electronic equipment Pending CN116222586A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310027780.8A CN116222586A (en) 2023-01-09 2023-01-09 Fusion positioning method and device for automatic driving vehicle and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310027780.8A CN116222586A (en) 2023-01-09 2023-01-09 Fusion positioning method and device for automatic driving vehicle and electronic equipment

Publications (1)

Publication Number Publication Date
CN116222586A true CN116222586A (en) 2023-06-06

Family

ID=86570695

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310027780.8A Pending CN116222586A (en) 2023-01-09 2023-01-09 Fusion positioning method and device for automatic driving vehicle and electronic equipment

Country Status (1)

Country Link
CN (1) CN116222586A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116518986A (en) * 2023-07-04 2023-08-01 蘑菇车联信息科技有限公司 Positioning method and device for automatic driving vehicle, electronic equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116518986A (en) * 2023-07-04 2023-08-01 蘑菇车联信息科技有限公司 Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN116518986B (en) * 2023-07-04 2023-10-03 蘑菇车联信息科技有限公司 Positioning method and device for automatic driving vehicle, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111947671B (en) Method, apparatus, computing device and computer-readable storage medium for positioning
US10267638B2 (en) Method and system for adapting a navigation system
CN115390086B (en) Fusion positioning method and device for automatic driving, electronic equipment and storage medium
CN114279453B (en) Automatic driving vehicle positioning method and device based on vehicle-road cooperation and electronic equipment
CN114777814A (en) Fusion positioning precision evaluation method, device and system based on vehicle road cloud
CN116222586A (en) Fusion positioning method and device for automatic driving vehicle and electronic equipment
CN115184976B (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN115546315A (en) Sensor on-line calibration method and device for automatic driving vehicle and storage medium
CN115727843A (en) Wheel speed determination method, device and equipment for dead reckoning
CN115056801A (en) Multipath recognition method and device for automatic driving, electronic equipment and storage medium
CN114114369A (en) Autonomous vehicle positioning method and apparatus, electronic device, and storage medium
CN113932835B (en) Calibration method and device for positioning lever arm of automatic driving vehicle and electronic equipment
CN115950441B (en) Fusion positioning method and device for automatic driving vehicle and electronic equipment
CN116148821A (en) Laser radar external parameter correction method and device, electronic equipment and storage medium
WO2023185215A1 (en) Data calibration
CN116106869A (en) Positioning evaluation method and device for automatic driving vehicle and electronic equipment
CN115014395A (en) Real-time calibration method and device for vehicle course angle for automatic driving
CN116448146A (en) Inertial navigation system self-calibration method, device, equipment and storage medium
CN113917512B (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN114739416A (en) Automatic driving vehicle positioning method and device, electronic equipment and storage medium
CN115112125A (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN115183786A (en) Training method and device of sensor error prediction model for automatic driving
CN116518986B (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN115685280A (en) Attitude optimization method and device for automatic driving vehicle and electronic equipment
CN116559899B (en) Fusion positioning method and device for automatic driving vehicle and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination