CN114964270B - Fusion positioning method, device, vehicle and storage medium - Google Patents

Fusion positioning method, device, vehicle and storage medium

Info

Publication number: CN114964270B (application CN202210540746.6A)
Authority: CN (China)
Prior art keywords: fusion positioning; measurement source; current frame; data; source data
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202210540746.6A
Other languages: Chinese (zh)
Other versions: CN114964270A (en)
Inventors: 张丹 (Zhang Dan), 朱昊 (Zhu Hao), 李金珂 (Li Jinke)
Current Assignee: Uisee Technologies Beijing Co Ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Uisee Technologies Beijing Co Ltd
Application filed by Uisee Technologies Beijing Co Ltd
Priority application: CN202210540746.6A
Publication of application: CN114964270A
Publication of grant: CN114964270B
Legal status: Active

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 — Navigation instruments specially adapted for navigation in a road network
    • G01C21/28 — Navigation instruments specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C22/00 — Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The embodiment of the invention discloses a fusion positioning method, a fusion positioning device, a vehicle and a storage medium. The method comprises: acquiring odometer data and measurement source data of at least one measurement source when a fusion positioning event is triggered; screening out target measurement sources whose time confidence is valid according to the timestamps in the measurement source data, updating the target measurement source data corresponding to the target measurement sources with the odometer data, and generating current frame measurement source data for each target measurement source; judging, according to the current frame measurement source data and the previous frame fusion positioning result, whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning; and, for the target measurement sources participating in fusion positioning, generating a current frame fusion positioning result with a set filter based on the corresponding current frame measurement source data. Because the target measurement source data is updated with the odometer data, a more accurate fusion positioning result can be obtained, and generating the fusion positioning result with a set filter further improves the robustness of fusion positioning.

Description

Fusion positioning method, device, vehicle and storage medium
Technical Field
The invention relates to the technical field of automatic driving, in particular to a fusion positioning method, a fusion positioning device, a vehicle and a storage medium.
Background
Fusion positioning combines multiple positioning technologies according to their characteristics and the application requirements, fusing the resulting positioning information through a specific algorithm so that the sources reinforce one another, thereby improving positioning performance.
In current automatic driving scenarios, positioning information during vehicle travel is collected by multiple sensors and fused by a Kalman filter to achieve high-precision positioning of the vehicle.
In the process of implementing the invention, the inventors found that in related fusion positioning techniques the timestamps of the positioning information from the multiple sensors differ, so that high-precision positioning information cannot be obtained.
Disclosure of Invention
The invention provides a fusion positioning method, a fusion positioning device, a vehicle and a storage medium, which can improve the accuracy of fusion positioning results.
According to an aspect of the present invention, there is provided a fusion positioning method including: acquiring odometer data and measurement source data of at least one path of measurement sources when the fusion positioning event is triggered;
Screening out target measurement sources with effective time confidence according to the time stamp in the measurement source data, updating target measurement source data corresponding to the target measurement sources through the odometer data, and generating current frame measurement source data of each path of target measurement sources;
Judging, according to the current frame measurement source data and the previous frame fusion positioning result, whether a target measurement source corresponding to the current frame measurement source data participates in fusion positioning;
and for the target measurement sources participating in fusion positioning, a set filter is adopted to generate a current frame fusion positioning result based on the corresponding current frame measurement source data.
Optionally, after generating the current frame measurement source data of each path of target measurement source, the method further includes:
Clustering the target measurement sources based on current frame measurement source data of each path of target measurement source, and attaching a clustering mark to each path of target measurement source according to a clustering result.
Optionally, the method further comprises:
determining a first measurement source which is successfully clustered in the target measurement sources according to the clustering mark;
Acquiring cluster center data of the first measurement source, and determining a first distance difference and a first heading angle difference between the cluster center data and a current frame first fusion positioning result, wherein the current frame first fusion positioning result comprises the position data and heading angle in the current frame fusion positioning result;
determining a starting timestamp of the fusion positioning deviation according to the first distance difference and the first heading angle difference;
Recording the duration time of the fusion positioning deviation based on the starting time stamp, and determining the fusion positioning state as a biased state when the duration time meets a set condition;
And when the fusion positioning state is a biased state, determining that the first fusion positioning result of the current frame triggers the filter reset.
Optionally, after the first fused positioning result of the current frame triggers the filter reset, the method further includes:
and acquiring an average value of current frame measurement source data corresponding to each first measurement source, and updating the state quantity of the set filter according to the average value to realize filter reset.
Optionally, the method further comprises:
If the first measurement source which is successfully clustered does not exist in the target measurement sources according to the clustering mark, determining a second measurement source according to the time stamp corresponding to the target measurement source, and timing by taking the time stamp corresponding to the second measurement source as a timing starting point;
And if the first measurement sources which are clustered successfully do not exist within the set timeout time, updating the state quantity of the set filter according to the current frame measurement source data corresponding to the second measurement source so as to realize filter reset.
Optionally, after the resetting of the filter, the method further comprises:
Updating the last frame fusion positioning result according to the odometer data to obtain a current frame fusion positioning updating result;
And according to the current frame fusion positioning updating result and the last frame fusion positioning result, predicting and updating the state quantity of the set filter and the variance of the prediction error.
Optionally, the generating, by using a setting filter, a current frame fusion positioning result based on the corresponding current frame measurement source data includes:
Acquiring the weight of each path of target measurement source according to the confidence coefficient in the current frame measurement source data participating in fusion positioning and a preset weight;
Obtaining the observed quantity of the set filter according to the weight of each path of target measurement source and the corresponding current frame measurement source data;
And generating a current frame fusion positioning result according to the obtained observed quantity of the setting filter and the predicted and updated state quantity.
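As one possible reading of the steps above, each source's weight can be formed from its reported confidence and a preset per-source weight, and the filter observation taken as the weighted combination of the sources' poses. The sketch below is an illustrative assumption (the field names `confidence`, `preset_weight` and `pose`, and the plain normalised weighting, are not specified by the text), and it ignores heading wrap-around for simplicity:

```python
def fused_observation(sources):
    """Combine current frame measurement source data into one filter
    observation. Each source's weight is confidence * preset weight,
    normalised over the participating sources (illustrative scheme)."""
    total = sum(s["confidence"] * s["preset_weight"] for s in sources)
    obs = [0.0, 0.0, 0.0]  # east, north, heading
    for s in sources:
        w = s["confidence"] * s["preset_weight"] / total
        for i in range(3):
            # NOTE: a real implementation must average heading angles
            # with wrap-around handling; omitted here for brevity.
            obs[i] += w * s["pose"][i]
    return obs
```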
Optionally, after obtaining the observed quantity of the setting filter according to the weight of each path of target measurement source and the corresponding current frame measurement source data, the method further includes:
And according to the variance of the prediction error after prediction updating and the average value of the observation error of the setting filter, carrying out observation updating on the variance of the observation error of the setting filter.
Optionally, after obtaining the weight of each path of target measurement source, the method further includes:
Acquiring fusion positioning confidence coefficient of a fusion positioning result of the current frame according to the weight and the confidence coefficient of each target measurement source participating in fusion;
and determining the fusion positioning state of the fusion positioning result of the current frame according to the fusion positioning confidence.
Optionally, the method further comprises:
If no target measurement source participates in fusion positioning, the current frame fusion positioning updating result is used as the current frame fusion positioning result, and the fusion positioning state is updated into a pure motion estimation state.
Optionally, after updating the fused localization state to the pure motion estimation state, further comprising:
And acquiring the duration time or the duration distance of the pure motion estimation state, and judging whether the fusion positioning result of the current frame triggers the filter reset or not based on the duration time or the duration distance.
Optionally, the updating, by the odometer data, the target measurement source data corresponding to the target measurement source, and generating current frame measurement source data of each path of target measurement source includes:
Determining a timestamp difference value according to the current timestamp and the timestamp in the target measurement source data;
and generating a compensation coefficient according to the timestamp difference value, the odometer data and the vehicle wheelbase, updating target measurement source data corresponding to each path of target measurement source according to the compensation coefficient, and generating current frame measurement source data of each path of target measurement source.
Optionally, the judging, according to the current frame measurement source data and the previous frame fusion positioning result, whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning includes:
determining a second distance difference value and a second course angle difference value of the fusion positioning result of the current frame measurement source data of each path of target measurement source and the previous frame;
Acquiring a preset first group of thresholds and a preset second group of thresholds, wherein the second group of thresholds is larger than the first group of thresholds, and each group of thresholds comprises a distance threshold and a heading threshold;
for each path of target measurement source, determining that the corresponding target measurement source participates in fusion positioning when the second distance difference value and the second course angle difference value are smaller than the first group of threshold values;
comparing the second distance difference and the second heading angle difference with the second set of thresholds when the second distance difference or the second heading angle difference is equal to or greater than the first set of thresholds;
and if the second distance difference value and the second course angle difference value are smaller than the second group of threshold values, determining that the corresponding target measuring source participates in fusion positioning.
Optionally, after generating the current frame fusion positioning result based on the corresponding current frame measurement source data by using the setting filter, the method further includes:
Determining a third distance difference value and a third course angle difference value of current frame measurement source data of each path of target measurement source and a current frame second fusion positioning result, wherein the current frame second fusion positioning result comprises position data and a course angle in the current frame fusion positioning result;
Judging whether each path of target measuring source is in an abnormal state according to the third distance difference value and the third course angle difference value;
If yes, the abnormal state information is sent to the corresponding target measuring source so as to instruct the corresponding target measuring source to reset.
According to another aspect of the present invention, there is provided a fusion positioning device comprising: the data acquisition module is used for acquiring odometer data and measurement source data of at least one path of measurement sources when the fusion positioning event is triggered;
The data updating module is used for screening out target measurement sources with effective time confidence according to the time stamp in the measurement source data, updating the target measurement source data corresponding to the target measurement sources through the odometer data and generating current frame measurement source data of each path of target measurement sources;
the participation fusion judging module is used for judging, according to the current frame measurement source data and the previous frame fusion positioning result, whether a target measurement source corresponding to the current frame measurement source data participates in fusion positioning;
and the fusion positioning module is used for generating a current frame fusion positioning result based on corresponding current frame measurement source data by adopting a set filter for the target measurement sources participating in fusion positioning.
According to another aspect of the present invention, there is provided a vehicle including: at least one measurement source for providing measurement source data during travel of the vehicle; an odometer for providing odometer data during travel of the vehicle;
At least one processor; and
A memory communicatively coupled to the at least one processor; wherein,
The memory stores a computer program executable by the at least one processor to enable the at least one processor to perform a fusion positioning method according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute a fusion positioning method according to any embodiment of the present invention.
According to the technical scheme, the target measurement source data can be updated with the odometer data to obtain a more accurate fusion positioning result. Meanwhile, whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning is judged from the current frame measurement source data and the previous frame fusion positioning result, which improves the robustness of fusion positioning and solves the problem that related fusion positioning technologies cannot obtain high-precision positioning information.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a fusion positioning method provided according to an embodiment of the present invention;
FIG. 2 is a flow chart of a fusion positioning method provided according to an embodiment of the present invention;
FIG. 3 is a flow chart of a fusion positioning method provided according to an embodiment of the present invention;
FIG. 4 is a flow chart of a fusion positioning method provided according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a fusion positioning device according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a vehicle implementing a fusion positioning method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a flowchart of a fusion positioning method according to an embodiment of the present invention. The method may be applied in automatic driving scenarios and may be performed by a fusion positioning device, which may be implemented in hardware and/or software and integrated in a vehicle. As shown in Fig. 1, the method includes:
S110, acquiring odometer data and measurement source data of at least one measurement source when the fusion positioning event is triggered.
In the embodiment of the invention, fusion positioning is executed after a fusion positioning event is triggered. The event can be triggered in various ways, which the embodiment of the invention does not specifically limit. For example, fusion positioning may be triggered by a timer at fixed times; or triggered when the time interval since the previous frame's fusion positioning reaches a set interval; or triggered periodically.
In the embodiment of the invention, a timer triggers a fusion positioning event at set time intervals. Optionally, the trigger time of the fusion positioning event may be taken as the current frame fusion positioning timestamp, which coincides with the timer time. Successive fusion positioning events are triggered at a constant interval, which may be set as a first time threshold. The value of this threshold is not specifically limited in this embodiment and can be configured flexibly for the actual application scenario; for example, the first time threshold is the fusion positioning frame period and may take a value of 10 ms to 50 ms.
When fusion positioning is triggered by the time interval since the previous frame, the end time of the previous frame's fusion positioning is used as the timing starting point, and fusion positioning is triggered when the elapsed time reaches the set interval.
When fusion positioning is triggered periodically, it is triggered according to a preset trigger period value.
The odometer data are data measured or calculated by the odometer and may include a timestamp, vehicle speed, front-wheel steering angle, and the like.
The measurement source data are data measured or calculated by a measurement source and may include a timestamp, east position, north position, elevation, pitch angle, roll angle, heading angle, confidence, and the like. The measurement source may be any device in the vehicle usable for positioning, such as GPS (Global Positioning System), visual SLAM (Simultaneous Localization and Mapping), laser SLAM positioning, visual semantic positioning, or laser semantic positioning; any source that can implement a positioning function on the vehicle may be used, and the embodiment of the present invention does not limit the type or number of measurement sources.
S120, screening out target measurement sources whose time confidence is valid according to the timestamps in the measurement source data, and updating the target measurement source data corresponding to the target measurement sources with the odometer data to generate current frame measurement source data for each target measurement source.
The time confidence is used to judge whether a measurement source may participate in fusion positioning; only measurement sources whose time confidence is valid participate. The timestamp of each measurement source is compared with the fusion positioning timestamp, and the time confidence state of the measurement source is determined from the comparison result. Specifically, if the difference between a measurement source's timestamp and the current frame fusion positioning timestamp is smaller than a second time threshold and the source's confidence is larger than a first confidence threshold, the source's time confidence is determined to be valid for the current frame fusion positioning; otherwise it is determined to be invalid. Both the second time threshold and the first confidence threshold may be set as required in this embodiment: the second time threshold, used to judge whether the measurement source is valid, may take a value of 20 ms to 1000 ms, and the first confidence threshold, also used to judge whether the measurement source is valid, may take a value of 0.2 to 0.4.
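The screening rule above can be sketched as follows. This is a minimal illustration, assuming each source is a dict with `timestamp` (seconds) and `confidence` fields; the default threshold values are example picks from the ranges given in the text (20 ms–1000 ms and 0.2–0.4):

```python
def screen_valid_sources(t_fusion, sources,
                         second_time_threshold=0.5,  # seconds, from the 20 ms-1000 ms range
                         first_conf_threshold=0.3):  # from the 0.2-0.4 range
    """Return the sources whose time confidence is valid: timestamp
    close enough to the fusion timestamp AND confidence high enough."""
    return [s for s in sources
            if abs(t_fusion - s["timestamp"]) < second_time_threshold
            and s["confidence"] > first_conf_threshold]
```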
In the embodiment of the invention, a measurement source whose time confidence is valid is taken as a target measurement source. The timestamp of a target measurement source generally differs from the fusion positioning timestamp. Since the odometer and the measurement sources are all devices in the vehicle, the odometer data can be used to propagate the vehicle position, while the measurement source data directly provide the vehicle's positioning data. Therefore, the odometer data can be used to compensate the target measurement source data of each time-confidence-valid target measurement source, aligning it to the fusion positioning timestamp.
According to the embodiment of the invention, the target measurement source data can be updated by using the odometer data, so that more accurate fusion positioning data can be obtained.
Illustratively, updating the target measurement source data corresponding to the target measurement sources by the odometry data may include: determining a time stamp difference value according to the current time stamp and the time stamp in the target measurement source data; and generating a compensation coefficient according to the timestamp difference value, the odometer data and the vehicle wheelbase, updating the target measurement source data corresponding to each path of target measurement source according to the compensation coefficient, and generating the current frame measurement source data of each path of target measurement source.
The current frame fusion positioning timestamp is referred to simply as the current timestamp, and the timestamp difference between the current timestamp and the timestamp of each time-confidence-valid target measurement source is calculated.
The compensation coefficients are used to time-compensate the target measurement source data of the current frame to obtain the current frame measurement source data. They comprise a first compensation coefficient D, a second compensation coefficient R, a third compensation coefficient C_x, and a fourth compensation coefficient C_y. The first compensation coefficient D may be generated from the current frame odometer speed and the above timestamp difference. The second compensation coefficient R may be generated from the vehicle wheelbase and the current frame odometer front-wheel steering angle. The third compensation coefficient C_x may be generated from the east position in the current frame target measurement source data, the second compensation coefficient R, and the heading angle in the current frame target measurement source data. The fourth compensation coefficient C_y may be generated from the north position in the current frame target measurement source data, the second compensation coefficient R, and the heading angle in the current frame target measurement source data.
In this case, updating the target measurement source data of each target measurement source according to the compensation coefficients to generate its current frame measurement source data may specifically include: updating the east position x_m, north position y_m, elevation z_m, pitch angle ψ_m, roll angle φ_m, and heading angle θ_m of each target measurement source according to the first compensation coefficient D, the second compensation coefficient R, the third compensation coefficient C_x, and the fourth compensation coefficient C_y, generating the current frame measurement source data of each target measurement source compensated by the odometer data.
Optionally, in the embodiment of the present invention, the current frame measurement source data are generated according to the following formulas:

Δt = t_i − t_m; (1.1)
D = V_i · Δt; (1.2)
R = L / tan S_i; (1.3)
C_x = x_m − R · sin θ_m; (1.4)
C_y = y_m + R · cos θ_m; (1.5)
θ'_m = θ_m + D / R; (1.6)
x'_m = C_x + D · cos θ'_m; (1.7)
y'_m = C_y + D · sin θ'_m; (1.8)
z'_m = z_m; (1.9)
ψ'_m = ψ_m; (1.10)
φ'_m = φ_m; (1.11)
where t_i is the current timestamp; t_m is the timestamp in the target measurement source data; V_i is the current frame odometer speed; S_i is the current frame odometer front-wheel steering angle; L is the vehicle wheelbase; X_m is the current frame measurement source data before updating, comprising the target measurement source's east position x_m, north position y_m, elevation z_m, pitch angle ψ_m, roll angle φ_m, and heading angle θ_m; and X'_m is the updated current frame measurement source data, comprising the target measurement source's east position x'_m, north position y'_m, elevation z'_m, pitch angle ψ'_m, roll angle φ'_m, and heading angle θ'_m.
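A runnable sketch of this compensation, assuming a bicycle model with constant speed and steering over the timestamp gap. Note two assumptions that are not the granted text: the sketch uses the arc-consistent position update x'_m = C_x + R·sin θ'_m, y'_m = C_y − R·cos θ'_m (which keeps the point on the turning circle defined by the centre C_x, C_y), and it handles straight driving (S_i ≈ 0, so R → ∞) as a separate limit:

```python
import math

def compensate_measurement(t_i, meas, v_i, s_i, wheelbase):
    """Align one measurement-source sample to the fusion timestamp t_i
    using current frame odometer speed v_i and front-wheel steering
    angle s_i (radians). meas = (t_m, x, y, z, pitch, roll, heading)."""
    t_m, x_m, y_m, z_m, pitch_m, roll_m, theta_m = meas
    dt = t_i - t_m                    # (1.1) timestamp difference
    d = v_i * dt                      # (1.2) distance travelled in the gap
    if abs(s_i) < 1e-6:               # straight driving: R -> infinity
        theta = theta_m
        x = x_m + d * math.cos(theta)
        y = y_m + d * math.sin(theta)
    else:
        r = wheelbase / math.tan(s_i)        # (1.3) turning radius
        c_x = x_m - r * math.sin(theta_m)    # (1.4) circle centre, east
        c_y = y_m + r * math.cos(theta_m)    # (1.5) circle centre, north
        theta = theta_m + d / r              # (1.6) updated heading
        x = c_x + r * math.sin(theta)        # arc-consistent update (assumption)
        y = c_y - r * math.cos(theta)        # arc-consistent update (assumption)
    # (1.9)-(1.11): elevation, pitch and roll pass through unchanged
    return (t_i, x, y, z_m, pitch_m, roll_m, theta)
```

For a small steering angle the arc update converges to the straight-line dead-reckoning result x_m + D·cos θ_m, y_m + D·sin θ_m.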
S130, judging, according to the current frame measurement source data and the previous frame fusion positioning result, whether a target measurement source corresponding to the current frame measurement source data participates in fusion positioning.
Because fusion positioning is triggered periodically in the embodiment of the invention, the previous frame fusion positioning result is the positioning information obtained by fusing the previous frame's measurement source data, before the current frame's measurement source data are fused.
It should be noted that a target measurement source with valid time confidence may still be unable to participate in fusion positioning for certain reasons; whether the target measurement source corresponding to the current frame measurement source data may participate is judged from the east position, north position, and heading angle in the current frame measurement source data and the corresponding parameters in the previous frame fusion positioning result.
Illustratively, S130 may include: determining a second distance difference value and a second course angle difference value of the fusion positioning result of the current frame measurement source data of each path of target measurement source and the previous frame; acquiring a preset first group of thresholds and a preset second group of thresholds, wherein the second group of thresholds is larger than the first group of thresholds, and each group of thresholds comprises a distance threshold and a heading threshold; for each path of target measurement source, when the second distance difference value and the second course angle difference value are smaller than the first group of threshold values, determining that the corresponding target measurement source participates in fusion positioning; comparing the second distance difference value and the second heading angle difference value with a second set of thresholds when the second distance difference value or the second heading angle difference value is equal to or greater than the first set of thresholds; and if the second distance difference value and the second course angle difference value are smaller than a second group of threshold values, determining that the corresponding target measuring source participates in fusion positioning.
In the implementation of the invention, the distance difference dm and the heading angle difference Δθm between the current-frame measurement source data and the previous-frame fusion positioning result are calculated using the following formulas:
dm=√((xm-x)²+(ym-y)²); (1.14)
Δθm=|θm-θ|; (1.15)
wherein xm, ym, θm are respectively the east position, north position and heading angle included in the current-frame measurement source data, and x, y, θ are respectively the east position, north position and heading angle included in the previous-frame fusion positioning result.
In the embodiment of the invention, whether the target measurement source corresponding to the current-frame measurement source data participates in fusion positioning is judged from the distance difference dm and the heading angle difference Δθm. Specifically, when dm is smaller than the set first distance threshold and Δθm is smaller than the set first heading threshold, the target measurement source is determined to participate in fusion positioning; otherwise it is determined not to participate. If all target measurement sources are determined not to participate, the following judgment is continued. The first distance threshold is the first-tier distance threshold for admitting measurement sources with valid time confidence into fusion, and may take a value of 0.2 m to 0.5 m. The first heading threshold is the first-tier heading threshold for admitting such measurement sources into fusion, and may take a value of 0.5 deg to 1.0 deg.
When dm is smaller than the set second distance threshold and Δθm is smaller than the set second heading threshold, the target measurement source is determined to participate in fusion positioning; otherwise it is determined not to participate. The second distance threshold is the second-tier distance threshold for admitting measurement sources with valid time confidence into fusion, and may take a value of 0.5 m to 1.0 m. The second heading threshold is the second-tier heading threshold for admitting such measurement sources into fusion, and may take a value of 1.0 deg to 2.0 deg.
The embodiment of the invention thus provides two sets of thresholds: the first set comprises the first distance threshold and the first heading threshold, and the second set comprises the second distance threshold and the second heading threshold, where the second distance threshold is larger than the first distance threshold and the second heading threshold is larger than the first heading threshold. Judging whether each target measurement source participates in fusion positioning against two sets of thresholds of different magnitudes allows more measurement sources to take part, improving the robustness of fusion positioning.
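The two-tier gate described above can be sketched as follows. This is a minimal Python illustration; the function name and the particular threshold values (picked from the ranges stated above, distances in metres, headings in degrees) are ours.

```python
def participates(d_m, dtheta_m, tier1=(0.4, 0.8), tier2=(0.9, 1.6)):
    """Two-tier admission gate of S130: a source whose deviations pass the
    tight first-tier (distance, heading) thresholds participates directly;
    otherwise it is re-checked against the looser second-tier thresholds."""
    d1, h1 = tier1
    d2, h2 = tier2
    if d_m < d1 and dtheta_m < h1:       # both below the first set: admit
        return True
    return d_m < d2 and dtheta_m < h2    # otherwise compare with second set
```

A source rejected by both tiers is left out of the fusion for this frame.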
S140, for the target measurement sources participating in fusion positioning, a set filter is adopted to generate a current frame fusion positioning result based on corresponding current frame measurement source data.
The setting filter may be a filter capable of fusing data of a plurality of measurement sources to obtain positioning information, and the selection of the filter in the embodiment of the present invention is not particularly limited.
The current frame fusion positioning result is positioning information generated after the current frame measurement source data are fused.
In the embodiment of the invention, after the fusion positioning event is triggered, the fusion positioning data and the measurement source state data can be output, wherein the fusion positioning data can comprise: timestamp, east position, north position, elevation, pitch angle, roll angle, heading angle, confidence, fused positioning state data, etc., specifically, the fused positioning state data may include an unset success state, a high-precision positioning state, a low-precision positioning state, a pure motion estimation state, a biased state, etc.; measuring source state data may include: time confidence valid/invalid, engaged/not engaged fusion, abnormal/normal, etc.
In the embodiment of the invention, a Kalman filter can be adopted to generate the fusion positioning result of the current frame, and the system equation of the Kalman filter is as follows:
Xt=Xt-1+ut+wt-1; (1.16)
Zt=Xt+vt; (1.17)
Wherein the state quantity X of the filter is the fusion positioning result, comprising the east position x, north position y, elevation z, pitch angle ψ, roll angle φ and heading angle θ. The observed quantity Z of the filter is the current-frame measurement source data participating in fusion positioning, comprising the east position xm, north position ym, elevation zm, pitch angle ψm, roll angle φm and heading angle θm. Xt is the fusion positioning result of the Kalman filter for the current frame; Zt is the observed quantity of the Kalman filter for the current frame; Xt-1 is the previous-frame fusion positioning result; ut is the motion-update increment of the current-frame target measurement source based on the odometer data; wt-1 is the filter prediction error of the previous-frame fusion positioning; and vt is the filter observation error of the current-frame fusion positioning.
Alternatively, other types of filters may be used in embodiments of the present invention, where the system equations of the filters may change.
In the embodiment of the invention, in the filter initialization step, the means of the prediction error w and the observation error v are set by the parameter matrices Q and R respectively, and the variances of the prediction error and the observation error are denoted P⁻ and P respectively, both initialized to 0.
Under the condition that the fusion positioning state of the current frame meets the set condition, resetting the filter before fusion positioning is carried out on the measurement source data of the next frame corresponding to each path of measurement source participating in fusion positioning.
According to the embodiment of the invention, updating the target measurement source data with the odometer data yields more accurate fusion positioning, and judging whether the target measurement source corresponding to the current-frame measurement source data participates in fusion positioning, from the current-frame measurement source data and the previous-frame fusion positioning result, improves the robustness of fusion positioning.
Fig. 2 is a flowchart of another fusion positioning method provided in the embodiment of the present invention. After the step of generating the current-frame measurement source data of each target measurement source, this embodiment further includes steps of clustering the target measurement sources, attaching a cluster mark to each target measurement source according to the clustering result, determining the fusion positioning state based on the cluster center, and triggering a filter reset when the fusion positioning state is a biased state. As shown in fig. 2, the method includes:
And S210, acquiring odometer data and measurement source data of at least one measurement source when the fusion positioning event is triggered.
S220, screening out target measurement sources with effective time confidence according to the time stamps in the measurement source data, updating target measurement source data corresponding to the target measurement sources through the odometer data, and generating current frame measurement source data of each path of target measurement sources.
S230, clustering the target measurement sources based on the current frame measurement source data of each path of target measurement source, and attaching a clustering mark to each path of target measurement source according to a clustering result.
It should be noted that, some or all data in the current frame measurement source data of each path of target measurement source may be used to cluster the target measurement sources, and specifically, which data to use for clustering may be freely set according to the actual application scenario, and the embodiment of the present invention is not limited specifically.
Optionally, in the embodiment of the present invention, the east position, the north position and the course angle in the current frame measurement source data of each path of target measurement source are adopted to cluster the target measurement sources.
In the embodiment of the invention, the following formulas can be adopted to calculate the distance difference dij and the heading difference Δθij between target measurement sources:
dij=√((xi-xj)²+(yi-yj)²); (2.1)
Δθij=|θi-θj|; (2.2)
Wherein xi, yi, θi are respectively the east position, north position and heading angle included in the current-frame measurement source data corresponding to measurement source i, and xj, yj, θj are respectively the east position, north position and heading angle included in the current-frame measurement source data corresponding to measurement source j.
It should be noted that measurement source i and measurement source j are any two of the target measurement sources, including measurement sources participating in fusion positioning and measurement sources not participating. In this embodiment, whether two target measurement sources cluster successfully is determined from the values of dij and Δθij. For example, when dij is smaller than the set third distance threshold and Δθij is smaller than the set third heading threshold, measurement source i and measurement source j are determined to cluster successfully; otherwise their clustering is determined unsuccessful. The third distance threshold is the distance-based clustering threshold for measurement sources, taking a value of 0.5 m to 0.8 m. The third heading threshold is the heading-based clustering threshold for measurement sources, taking a value of 0.5 deg to 1.5 deg.
And carrying out cluster analysis on the target measurement sources, and endowing effective cluster marks for the measurement sources with successful clusters, wherein the effective cluster marks can be embodied in a digital form, such as 1,2, … and the like, and the embodiment is not particularly limited to the method. The clustering marks of the target measurement sources which are clustered successfully are consistent. In addition, the target measurement sources that are not successfully clustered are assigned an invalid cluster signature, which may also be embodied in the form of a number, for example, the invalid cluster signature may be-1, which is not particularly limited in the embodiment of the present invention.
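The pairwise clustering and marking scheme above can be sketched as follows. This is a minimal Python illustration; the function name, threshold picks (from the stated ranges; distance in metres, heading in degrees) and mark-assignment details are ours.

```python
import math

def cluster_sources(sources, d_th=0.6, h_th=1.0):
    """Pairwise clustering of S230: two sources cluster when both their
    planar distance and heading gap (eq. (2.2)) fall below the thresholds.
    Successfully clustered sources share a positive numeric mark (1, 2, ...);
    sources that cluster with nothing keep the invalid mark -1."""
    marks = [-1] * len(sources)
    next_mark = 1
    for i in range(len(sources)):
        for j in range(i + 1, len(sources)):
            a, b = sources[i], sources[j]
            d_ij = math.hypot(a["x"] - b["x"], a["y"] - b["y"])
            dth_ij = abs(a["heading"] - b["heading"])          # (2.2)
            if d_ij < d_th and dth_ij < h_th:
                if marks[i] == -1 and marks[j] == -1:
                    marks[i] = marks[j] = next_mark            # new cluster
                    next_mark += 1
                elif marks[i] == -1:
                    marks[i] = marks[j]                        # join j's cluster
                elif marks[j] == -1:
                    marks[j] = marks[i]                        # join i's cluster
    return marks
```

Sources sharing a mark are the mutually consistent group later used for the cluster-center check.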
S240, judging whether a target measurement source corresponding to the current frame measurement source data participates in fusion positioning or not according to the fusion positioning result of the current frame measurement source data and the previous frame.
S250, for target measurement sources participating in fusion positioning, a set filter is adopted to generate a current frame fusion positioning result based on corresponding current frame measurement source data.
S260, determining a first measurement source which is successfully clustered in the target measurement sources according to the clustering mark.
For example, a first measurement source of the target measurement sources that is successfully clustered may be determined from the digital form of the cluster signature.
S270, acquiring cluster center data of a first measurement source, and determining a first distance difference value and a first heading difference value of a first fusion positioning result of the cluster center data and the current frame.
The first fusion positioning result of the current frame comprises position data and course angle in the fusion positioning result of the current frame.
The clustering center data is all or part of the clustering center data of the first measurement source which is successfully clustered. There are many ways to obtain the cluster center, and embodiments of the present invention are not limited in particular. For example, a geometric center of the current frame measurement source data corresponding to the first measurement source may be determined as a cluster center. Or determining the cluster center of the first measurement source by adopting a cluster analysis K-Means algorithm. And determining whether all data of the cluster center are selected as cluster center data or part of data of the cluster center are selected as cluster center data according to the data included in the first fusion positioning result of the current frame.
The east position, north position and heading angle contained in the current-frame fusion positioning result are taken to form the first fusion positioning result of the current frame. Correspondingly, the east position, north position and heading angle of the cluster center are taken to form the cluster center data. The first distance difference between the cluster center data and the first fusion positioning result of the current frame is calculated from the east and north positions, and the first heading difference is calculated from the heading angles. Specifically, in the embodiment of the invention, the first distance difference dci and the first heading difference Δθci between the cluster center data and the first fusion positioning result of the current frame are calculated using the following formulas:
dci=√((xc-xi)²+(yc-yi)²);
Δθci=|θc-θi|;
wherein xc, yc, θc are respectively the east position, north position and heading angle included in the cluster center data, and xi, yi, θi are respectively the east position, north position and heading angle included in the first fusion positioning result of the current frame.
S280, determining a starting time stamp of the fusion positioning deviation according to the first distance difference value and the first course angle difference value.
The start time stamp of the fused positioning deviation refers to the time when the fused positioning deviation starts to meet the set condition.
For example, when the first distance difference dci is greater than the set fourth distance threshold and the first heading difference Δθci is greater than the set fourth heading threshold, the fusion positioning deviation start timestamp tcs is recorded. The fourth distance threshold is the fusion positioning deviation distance threshold, taking a value of 0.8 m to 1.0 m. The fourth heading threshold is the fusion positioning deviation heading threshold, taking a value of 0.5 deg to 1.5 deg.
S290, recording duration time of fusion positioning deviation based on the starting time stamp, and determining that the fusion positioning state is a biased state when the duration time meets the set condition.
The setting condition is a condition for determining whether the fusion positioning state is a biased state. A third time threshold may be set, and if the duration of the fused positioning deviation is greater than the third time threshold, it is determined that the duration satisfies the set condition. The third time threshold is a fusion positioning deviation time threshold, and the value can be 1 s-2 s.
For example, if the first distance difference dci continuously remains greater than the set fourth distance threshold and the first heading difference Δθci continuously remains greater than the set fourth heading threshold, the fusion positioning deviation is determined to be sustained. The time over which the deviation persists is recorded, giving the duration of the fusion positioning deviation with the start timestamp as the timing origin.
Specifically, in the embodiment of the present invention, the duration ΔT is calculated using the following formula:
ΔT=ti-tcs; (2.6)
Where ti is the fusion positioning timestamp of the current frame.
When the duration ΔT is greater than the third time threshold, the fusion positioning is determined to be biased.
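The deviation timing of S280/S290 can be sketched as a small stateful monitor. This is a minimal Python illustration; the class name and the threshold picks (from the stated ranges) are ours.

```python
class BiasMonitor:
    """Tracks the fused-positioning deviation: records the start timestamp
    t_cs when both deviations exceed their (fourth) thresholds, and reports
    'biased' once the condition has persisted for longer than the third
    time threshold (duration per eq. (2.6))."""

    def __init__(self, d_th=0.9, h_th=1.0, t_th=1.5):
        self.d_th, self.h_th, self.t_th = d_th, h_th, t_th
        self.t_start = None            # t_cs: start of the current deviation

    def update(self, d_ci, dtheta_ci, t_i):
        if d_ci > self.d_th and dtheta_ci > self.h_th:
            if self.t_start is None:
                self.t_start = t_i     # deviation just began
            return (t_i - self.t_start) > self.t_th   # ΔT = t_i - t_cs
        self.t_start = None            # deviation not sustained: reset timing
        return False
```

Once `update` returns `True` the fusion positioning state is set to biased and a filter reset is triggered.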
And S2100, when the fusion positioning state is in a biased state, determining that the first fusion positioning result of the current frame triggers the filter reset.
For example, if the fusion positioning state is a biased state, the first fusion positioning result of the current frame is determined to trigger a filter reset; the reset is then carried out on the next frame's target measurement source data, and the next frame's measurement source data are fused by the successfully reset filter. In this case, the mean of the current-frame measurement source data corresponding to each first measurement source is obtained, and the state quantity of the set filter is updated with this mean so as to reset the filter. It should be noted that the current-frame measurement source data here are in fact the target measurement source data of the frame following the fusion positioning result that triggered the reset: because fusion positioning runs cyclically, each frame's measurement source data may serve for a filter reset triggered by the previous frame's fusion positioning result. For convenience, the term current-frame measurement source data is used below for the measurement source data used to reset the filter after the reset is triggered.
Illustratively, the first measurement sources sharing the same cluster mark indicating successful clustering are obtained, and the mean of the current-frame measurement source data corresponding to these first measurement sources is calculated using the following formula:
X̄=(1/C)*∑kXk;
wherein Xk is the current-frame measurement source data (east position, north position, elevation, pitch angle, roll angle and heading angle) of the measurement source whose cluster mark equals the first cluster mark that is not -1; k identifies the first measurement sources and takes the values k=1,2,3,……,C; C is the number of first measurement sources; and X̄ comprises the mean east position, mean north position, mean elevation, mean pitch angle, mean roll angle and mean heading angle of the first measurement sources.
The state quantity of the current filter is updated to X̄, completing the reset of the filter.
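The cluster-mean reset can be sketched as follows. This is a minimal Python illustration; the function name and dictionary layout are ours, and it assumes at least one source carries the given mark.

```python
def reset_from_cluster(sources, marks, mark):
    """Reset helper for S2100: average, component by component, the
    current-frame measurement source data of the successfully clustered
    first measurement sources carrying `mark`. The returned mean becomes
    the filter's new state quantity."""
    fields = ("x", "y", "z", "pitch", "roll", "heading")
    members = [s for s, m in zip(sources, marks) if m == mark]
    c = len(members)                      # C: number of first measurement sources
    return {f: sum(s[f] for s in members) / c for f in fields}
```

Averaging over the agreeing cluster, rather than trusting a single source, is what prevents the reset itself from re-introducing a biased state.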
In another case: if the first measurement source which is successfully clustered does not exist in the target measurement sources according to the clustering mark, determining a second measurement source according to the time stamp corresponding to the target measurement source, and timing by taking the time stamp corresponding to the second measurement source as a timing starting point; if no first measurement source which is clustered successfully exists within the set timeout period, the state quantity of the set filter is updated according to the current frame measurement source data corresponding to the second measurement source, so that the filter is reset.
In the embodiment of the invention, if the cluster marks of all target measurement sources are -1, indicating that no successfully clustered first measurement source exists, timing starts from the measurement source whose time confidence first became valid (namely the second measurement source). When the timer reaches the fourth time threshold and no successfully clustered first measurement source has appeared in that period, the state quantity of the current filter is updated to the current-frame measurement source data corresponding to the second measurement source, thereby resetting the filter. The fourth time threshold is the single-positioning-source fusion-positioning reset time threshold, taking a value of 10 s to 20 s.
In the embodiment of the invention, if the filter has not been successfully reset, either by the mean of the current-frame measurement source data corresponding to the first measurement sources or by the current-frame measurement source data corresponding to the second measurement source, the fusion positioning state is determined to be a fusion-positioning invalid-reset state. Each time fusion positioning is triggered, this invalid-reset state is output, and the system keeps waiting until either clustering succeeds and the reset is completed from the mean of the first measurement sources' current-frame measurement source data, or, with no successful clustering, the reset is completed from the second measurement source's current-frame measurement source data.
S2110, updating the fusion positioning result of the previous frame according to the odometer data to obtain the fusion positioning updating result of the current frame.
In the embodiment of the invention, the fusion positioning result of the previous frame is firstly compensated by using the odometer data, and the fusion positioning result of the previous frame is aligned to the fusion positioning time stamp of the current frame; specifically, fusion positioning update can be performed by the following formula:
Δt=ti-ti-1; (2.9)
D=Vi-1*Δt; (2.10)
R=L/tanSi-1; (2.11)
Cx=xi-1-R*sinθi-1; (2.12)
Cy=yi-1+R*cosθi-1; (2.13)
θ'i=θi-1+D/R; (2.14)
x'i=Cx+D*cosθ'i; (2.15)
y'i=Cy+D*sinθ'i; (2.16)
z'i=zi-1; (2.17)
ψ'i=ψi-1; (2.18)
φ'i=φi-1; (2.19)
Wherein ti is the fusion positioning timestamp of the current frame; ti-1 is the fusion positioning timestamp of the previous frame; Vi-1 is the previous-frame odometer speed; Si-1 is the previous-frame odometer front-wheel steering angle; L is the wheelbase of the vehicle; Xi-1 is the previous-frame fusion positioning result, comprising an east position, a north position, an elevation, a pitch angle, a roll angle and a heading angle; and X'i is the current-frame fusion positioning update result, likewise comprising an east position, a north position, an elevation, a pitch angle, a roll angle and a heading angle.
S2120, according to the current frame fusion positioning updating result and the last frame fusion positioning result, carrying out prediction updating on the state quantity of the set filter and the variance of the prediction error.
Illustratively, the prediction-updated state quantity of the set filter and the variance of the prediction error can be determined using the following formulas:
ui=X'i-Xi-1; (2.22)
X̂i⁻=Xi-1+ui; (2.23)
Pi⁻=Pi-1+Q; (2.24)
wherein X̂i⁻ is the prediction-updated state quantity of the filter; ui is the difference between the current-frame fusion positioning update result and the previous-frame fusion positioning result; Pi⁻ is the prediction-updated variance of the prediction error; Pi-1 is the variance of the filter observation error corresponding to the previous-frame fusion positioning; and Q is the parameter matrix corresponding to the mean of the prediction error w.
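The prediction step can be sketched per component in scalar form, since the state and the observation live in the same six-dimensional space. This is a minimal Python illustration; the function name and dictionary layout are ours.

```python
def predict(x_prev, x_upd, p_prev, q):
    """Prediction update of S2120 (scalar per-component form): the motion
    increment u_i is the gap between the odometry-updated fix and the
    previous fused fix; the predicted state adds u_i to the previous
    state, and the predicted variance grows by the parameter Q."""
    u = {k: x_upd[k] - x_prev[k] for k in x_prev}     # u_i = X'_i - X_{i-1}
    x_pred = {k: x_prev[k] + u[k] for k in x_prev}    # predicted state
    p_pred = p_prev + q                               # predicted variance
    return x_pred, p_pred
```

With identity dynamics the predicted state coincides with the odometry-updated fix X'i; the step's real effect is inflating the variance by Q before the observation update.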
S2130, acquiring the weight of each path of target measurement source according to the confidence coefficient and the preset weight value in the current frame measurement source data participating in fusion positioning.
In the embodiment of the invention, for the current-frame measurement source data participating in fusion positioning, the weight of each target measurement source is obtained as the ratio of the product of the confidence included in its current-frame measurement source data and its preset weight (the preset weight may be uniform along the whole route, or different values may be set for different areas) to the sum of the confidence-weight products of all measurement sources.
The embodiment of the invention can calculate the weight of each target measurement source participating in fusion by using the following formula:
Wm=cm*wm/(∑kck*wk); (2.25)
Wherein cm is the confidence of the current target measurement source, wm is the preset weight corresponding to the current target measurement source, ck is the confidence of the k-th target measurement source, and wk is the preset weight corresponding to the k-th target measurement source.
S2140, obtaining the observed quantity of the set filter according to the weight of each path of target measurement source and the corresponding current frame measurement source data.
Illustratively, products of current frame measurement source data and weights corresponding to each target measurement source are calculated respectively, and the corresponding products of each path of target measurement sources are accumulated to obtain an observed quantity Z of the filter.
The embodiment of the invention can calculate the observed quantity Z of the filter by adopting the following formula:
Z=∑mWmX'm; (2.26)
Where Wm represents the weight of each target measurement source participating in fusion, and X'm represents the corresponding current-frame measurement source data participating in fusion.
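The weighting (eq. (2.25)) and observation blending (eq. (2.26)) can be sketched together. This is a minimal Python illustration; the function name and dictionary layout are ours.

```python
def fuse_observation(sources):
    """S2130/S2140: each source's weight is its confidence times its preset
    weight, normalised over all participating sources (eq. (2.25)); the
    filter observation Z is the weight-blended measurement (eq. (2.26))."""
    total = sum(s["conf"] * s["w"] for s in sources)
    weights = [s["conf"] * s["w"] / total for s in sources]    # W_m
    fields = ("x", "y", "z", "pitch", "roll", "heading")
    z = {f: sum(wt * s[f] for wt, s in zip(weights, sources))  # Z = Σ W_m X'_m
         for f in fields}
    return z, weights
```

Because the weights are normalised, a low-confidence source can participate without dragging the blended observation far from the trusted sources.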
S2150, according to the prediction error variance after prediction updating and the mean value of the observation error of the setting filter, carrying out observation updating on the observation error variance of the setting filter.
Illustratively, the observation update coefficient Ki is calculated with the prediction-updated variance of the prediction error Pi⁻ as the numerator and the sum of Pi⁻ and the mean R of the filter observation error as the denominator; Ki is a matrix.
In the embodiment of the invention, Ki can be calculated using the following formula:
Ki=Pi⁻*(Pi⁻+R)⁻¹;
wherein Pi⁻ is the prediction-updated variance of the prediction error, and R is the parameter matrix representing the mean of the filter observation error.
The difference between the identity matrix and the observation update coefficient Ki is then calculated, and its product with the prediction-updated variance of the prediction error Pi⁻ gives the variance Pi of the observation error after the filter observation update.
In the embodiment of the invention, Pi can be calculated using the following formula:
Pi=(I-Ki)*Pi⁻;
S2160, generating a current-frame fusion positioning result according to the obtained observed quantity of the set filter and the prediction-updated state quantity.
Illustratively, with the filter reset and the prediction update and observation update completed, the difference between the observed quantity Z of the filter and the prediction-updated state quantity X̂i⁻ is calculated, its product with the observation update coefficient Ki is formed, and this product is added to the prediction-updated state quantity X̂i⁻, giving the fusion positioning result of the current frame.
In the embodiment of the invention, the current-frame fusion positioning result Xi can be calculated by the following formula:
Xi=X̂i⁻+Ki*(Z-X̂i⁻);
wherein Xi is the fusion positioning result of the current frame.
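The gain, state and variance updates of S2150/S2160 can be sketched in scalar per-component form (valid here because the observation directly measures the state). This is a minimal Python illustration; the function name and dictionary layout are ours.

```python
def observe_update(x_pred, p_pred, z, r):
    """Observation update: Kalman gain K_i from the predicted variance and
    the observation-error parameter R, then blend prediction and observation
    and shrink the variance accordingly."""
    k = p_pred / (p_pred + r)                  # K_i = P_i^- / (P_i^- + R)
    x = {key: x_pred[key] + k * (z[key] - x_pred[key])   # X_i
         for key in x_pred}
    p = (1.0 - k) * p_pred                     # P_i = (1 - K_i) * P_i^-
    return x, p, k
```

With equal predicted variance and observation error (gain 0.5), the fused fix lands midway between prediction and observation, and the variance halves.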
S2170, obtaining the fusion positioning confidence coefficient of the fusion positioning result of the current frame according to the weight and the confidence coefficient of each target measurement source participating in fusion.
The product of the weight and the confidence of each target measurement source participating in fusion is calculated, and these products are summed to obtain the fusion positioning confidence of the current-frame fusion positioning result.
In the embodiment of the invention, the fusion positioning confidence coefficient can be obtained by the following formula:
Ci=∑kck*Wk; (2.33)
Wherein Wk is the weight of the k-th target measurement source participating in fusion, ck is the confidence of the k-th target measurement source participating in fusion, and Ci is the fusion positioning confidence of the current frame.
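Eq. (2.33) is a direct weighted sum; a minimal Python sketch (the function name is ours):

```python
def fused_confidence(weights, confs):
    """S2170 (eq. (2.33)): fused-positioning confidence as the
    weight-blended confidence of the participating sources."""
    return sum(w * c for w, c in zip(weights, confs))
```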
S2180, determining the fusion positioning state of the fusion positioning result of the current frame according to the fusion positioning confidence.
In the embodiment of the invention, if the fused positioning confidence is higher than the set second confidence threshold, the fused positioning state is updated to high-precision positioning, otherwise, the fused positioning state is updated to low-precision positioning. The second confidence threshold is a high-precision threshold for judging fusion positioning, and the value is 0.7-0.9.
Further, positioning information such as the fusion positioning result and the fusion positioning state is output to the planning layer of the unmanned driving system, so that the planning layer analyzes the positioning information to generate a vehicle control strategy.
According to the embodiment of the invention, resetting the filter with measurement-source cluster-center data avoids resetting it from a single measurement source's data, and so avoids obtaining biased fusion positioning data; presetting the weights of the target measurement sources meanwhile limits the influence of low-accuracy target measurement sources on fusion positioning.
Fig. 3 is a flowchart of another fused positioning method according to an embodiment of the present invention, where a step of triggering a filter reset based on a pure motion estimation state and performing prediction update on a state quantity of a set filter and a variance of a prediction error is added. As shown in fig. 3, the method includes:
And S310, acquiring odometer data and measurement source data of at least one measurement source when the fusion positioning event is triggered.
S320, screening out target measurement sources with effective time confidence according to the time stamps in the measurement source data, and updating target measurement source data corresponding to the target measurement sources through the odometer data to generate current frame measurement source data of each path of target measurement sources.
S330, judging, according to the current frame measurement source data and the previous frame fusion positioning result, whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning.
S340, if no target measurement source participates in fusion positioning, updating the previous frame fusion positioning result according to the odometer data to obtain a current frame fusion positioning update result, taking the current frame fusion positioning update result as the current frame fusion positioning result, and updating the fusion positioning state to a pure motion estimation state.
For example, if no target measurement source participates in fusion positioning, the current frame fusion positioning update result is used as the current frame fusion positioning result, and the fusion positioning state is updated to be a pure motion estimation state.
In the embodiment of the present invention, the current frame fusion positioning update result X'_i is calculated by formula (2.21), and the detailed calculation process is not repeated here.
In the embodiment of the invention, if no target measurement source participates in fusion, the fusion positioning output is the current frame fusion positioning update result X'_i, and the fusion positioning state is updated to the pure motion estimation state.
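Formula (2.21) itself is not reproduced in this excerpt. A plausible sketch of such an odometry-based update, assuming a standard unicycle dead-reckoning model (the function and its parameters are assumptions, not taken from the patent):

```python
import math

def propagate_pose(x, y, theta, v, yaw_rate, dt):
    # Advance the previous frame fused pose (x, y, theta) using odometer
    # speed v and yaw rate over the interval dt (pure motion estimation).
    x_new = x + v * dt * math.cos(theta)
    y_new = y + v * dt * math.sin(theta)
    theta_new = theta + yaw_rate * dt
    return x_new, y_new, theta_new

# Heading east at 1 m/s for 2 s with no turning.
pose = propagate_pose(0.0, 0.0, 0.0, 1.0, 0.0, 2.0)  # (2.0, 0.0, 0.0)
```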
And S350, acquiring the duration time or the duration distance of the pure motion estimation state, and judging whether the fusion positioning result of the current frame triggers the filter reset or not based on the duration time or the duration distance.
In the embodiment of the invention, if the fusion positioning states of multiple consecutive frames are all pure motion estimation states, the duration of the pure motion estimation state over these consecutive frames can be counted; if the duration exceeds a set fifth time threshold, the filter reset is triggered, that is, the filter reset process is entered. The fifth time threshold is the pure motion estimation time threshold, and its value may be 1 s to 5 s.
Optionally, if the fusion positioning states of multiple consecutive frames are all pure motion estimation states, the continuous distance of the pure motion estimation state over these consecutive frames can be counted; if the continuous distance exceeds a set fifth distance threshold, the filter reset is triggered, that is, the filter reset process is entered. The fifth distance threshold is the pure motion estimation distance threshold, and its value may be 5 m to 10 m.
It should be noted that, the filter resetting process is described in the above embodiments, and is not described herein again.
Specifically, the duration of the pure motion estimation state may be determined according to the difference between the current frame fusion positioning timestamp and the timestamp at which the current fusion positioning pure motion estimation state begins.
The calculation formula of the duration of the pure motion estimation state in the embodiment of the invention is as follows:
ΔT = t_i − t_s; (3.1)
Wherein t_s is the timestamp at which the fusion positioning pure motion estimation state begins; t_i is the current frame fusion positioning timestamp.
Optionally, for each frame k, the difference between the k-th frame fusion positioning timestamp and the (k−1)-th frame fusion positioning timestamp may be calculated and multiplied by the k-th frame odometer vehicle speed; summing these products over the period from the timestamp at which the current pure motion estimation state began to the previous frame fusion positioning timestamp gives the continuous distance of the pure motion estimation state.
The calculation formula of the continuous distance of the pure motion estimation state in the embodiment of the invention is as follows:
ΔS = Σ_k v_k (t_k − t_(k−1)); (3.2)
Wherein the sum runs over the frames between the start timestamp t_s of the fusion positioning pure motion estimation state and the current frame fusion positioning timestamp t_i; v_k is the k-th frame odometer vehicle speed.
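Equations (3.1) and (3.2) and the reset test can be sketched as below. The 3 s and 7 m thresholds are assumptions chosen inside the stated 1 s to 5 s and 5 m to 10 m ranges; all names are illustrative.

```python
def pure_motion_duration(t_i, t_s):
    # Eq. (3.1): elapsed time since the pure motion estimation state began.
    return t_i - t_s

def pure_motion_distance(frames):
    # Eq. (3.2): frames is a list of (timestamp, odometer_speed) pairs from
    # the frame where the pure motion estimation state began up to the
    # current frame; each segment contributes v_k * (t_k - t_{k-1}).
    return sum(v_k * (t_k - t_prev)
               for (t_prev, _), (t_k, v_k) in zip(frames, frames[1:]))

def triggers_filter_reset(delta_t, delta_s, t5=3.0, d5=7.0):
    # Reset when either the duration or the distance exceeds its threshold.
    return delta_t > t5 or delta_s > d5

dT = pure_motion_duration(10.0, 6.0)                     # 4.0 s
dS = pure_motion_distance([(0.0, 0.0), (1.0, 2.0), (2.0, 2.0)])  # 4.0 m
reset = triggers_filter_reset(dT, dS)                    # True (4.0 s > 3.0 s)
```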
According to the embodiment of the invention, the fusion positioning can be actively reset by judging the pure motion estimation state, so that the fusion positioning is recovered to be normal.
Fig. 4 is a flowchart of another fusion positioning method according to an embodiment of the present invention. After generating the current frame fusion positioning result based on the corresponding current frame measurement source data by using the set filter, this embodiment further includes judging whether each path of target measurement source is in an abnormal state, and instructing the abnormal target measurement source to reset. As shown in Fig. 4, the method includes:
S410, acquiring odometer data and measurement source data of at least one measurement source when the fusion positioning event is triggered.
S420, screening out target measurement sources with effective time confidence according to the time stamps in the measurement source data, updating target measurement source data corresponding to the target measurement sources through the odometer data, and generating current frame measurement source data of each path of target measurement sources.
S430, judging, according to the current frame measurement source data and the previous frame fusion positioning result, whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning.
S440, for the target measurement sources participating in fusion positioning, a set filter is adopted to generate a current frame fusion positioning result based on the corresponding current frame measurement source data.
S450, determining a third distance difference value and a third course angle difference value of current frame measurement source data of each path of target measurement source and a second fusion positioning result of the current frame, wherein the second fusion positioning result of the current frame comprises position data and a course angle in the fusion positioning result of the current frame.
In the embodiment of the invention, the third distance difference d_mi and the third heading angle difference Δθ_mi between the current frame measurement source data and the current frame second fusion positioning result can be calculated using the following formulas:
d_mi = √((x_m − x_i)² + (y_m − y_i)²); (4.1)
Δθ_mi = |θ_m − θ_i|; (4.2)
Wherein x_m, y_m, θ_m are the east position, north position and heading angle included in the current frame measurement source data, and x_i, y_i, θ_i are the east position, north position and heading angle included in the current frame second fusion positioning result.
And S460, judging whether each path of target measurement source is in an abnormal state according to the third distance difference value and the third course angle difference value, if so, executing S470, otherwise, executing S480.
In the embodiment of the invention, when the distance d_mi between the current frame measurement source and the fusion positioning is greater than a set sixth distance threshold and the heading difference Δθ_mi is greater than a set fifth heading threshold, the abnormality start timestamp t_ms of the target measurement source is recorded. When the continuous abnormality time ΔT of the target measurement source is greater than a set sixth time threshold, the state of the target measurement source is determined to be abnormal; otherwise, the state of the target measurement source is determined to be normal. The sixth time threshold is the measurement source abnormality time threshold, and its value may be 1 s to 2 s. The sixth distance threshold is the measurement source abnormality distance threshold, and its value may be 0.8 m to 1.0 m. The fifth heading threshold is the measurement source abnormality heading threshold, and its value may be 0.5 deg to 1.5 deg.
In the embodiment of the invention, the continuous abnormality time ΔT of the target measurement source can be calculated using the following formula:
ΔT = t_i − t_ms; (4.4)
Wherein t_i is the current frame fusion positioning timestamp, and t_ms is the target measurement source abnormality start timestamp.
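Putting equations (4.1), (4.2) and (4.4) together, the abnormality test for one target measurement source might look like the sketch below. The threshold values (0.9 m, 1.0 deg, 1.5 s) are assumptions chosen within the stated ranges, and angles are handled in radians; all names are illustrative.

```python
import math

def source_is_abnormal(x_m, y_m, th_m, x_i, y_i, th_i, t_i, t_ms,
                       d6=0.9, heading5=math.radians(1.0), t6=1.5):
    # Eq. (4.1): distance between measurement source and fusion positioning.
    d_mi = math.hypot(x_m - x_i, y_m - y_i)
    # Eq. (4.2): absolute heading difference.
    dtheta_mi = abs(th_m - th_i)
    # Eq. (4.4): time elapsed since the abnormality start timestamp t_ms.
    delta_t = t_i - t_ms
    deviating = d_mi > d6 and dtheta_mi > heading5
    # Abnormal only when the deviation has persisted longer than t6.
    return deviating and delta_t > t6
```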
S470, sending abnormal state information to the corresponding target measurement source so as to instruct the corresponding target measurement source to reset.
In the embodiment of the invention, if a target measurement source is in an abnormal state, abnormal state information is sent to the corresponding target measurement source to instruct it to reset. If the measurement source completes the reset, it may continue to participate in the fusion. A measurement source that does not complete the reset has an invalid time confidence and does not participate in the fusion.
S480, determining that the target measurement source is in a normal state.
According to the embodiment of the invention, the state of each target measurement source is judged so that an abnormal measurement source is reset and its positioning is restored to normal, which improves the robustness of fusion positioning against abnormal measurement sources.
Fig. 5 is a schematic structural diagram of a fusion positioning device according to an embodiment of the present invention. As shown in fig. 5, the apparatus includes:
the data acquisition module 510 is configured to acquire odometer data and measurement source data of at least one measurement source when triggered by a fusion positioning event;
The data updating module 520 is configured to screen out a target measurement source with an effective time confidence according to a timestamp in the measurement source data, update target measurement source data corresponding to the target measurement source by using odometer data, and generate current frame measurement source data of each path of target measurement source;
the participation fusion judging module 530 is configured to judge, according to the current frame measurement source data and the previous frame fusion positioning result, whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning;
The fusion positioning module 540 is configured to generate, for a target measurement source involved in fusion positioning, a current frame fusion positioning result based on corresponding current frame measurement source data by using a set filter.
Optionally, the apparatus further includes, after the data acquisition module 510:
And the marking module is used for clustering the target measuring sources based on the current frame measuring source data of each path of target measuring source and attaching a clustering mark to each path of target measuring source according to the clustering result.
Optionally, the marking module includes:
the first measurement source determining unit is used for determining a first measurement source which is successfully clustered in the target measurement sources according to the clustering mark;
The difference value determining unit is used for obtaining cluster center data of the first measuring source and determining a first distance difference value and a first course difference value of a first fusion positioning result of the cluster center data and the current frame, wherein the first fusion positioning result of the current frame comprises position data and a course angle in the fusion positioning result of the current frame;
The time stamp determining unit is used for determining a starting time stamp of the fusion positioning deviation according to the first distance difference value and the first course angle difference value;
The deviation state determining unit is used for recording duration time of fusion positioning deviation based on the starting time stamp, and determining that the fusion positioning state is a deviation state when the duration time meets the set condition;
And the filter resetting unit is used for determining that the first fusion positioning result of the current frame triggers the filter resetting when the fusion positioning state is in a biased state.
Optionally, the apparatus further includes, after the filter resetting unit:
and the average value updating subunit is used for acquiring the average value of the current frame measurement source data corresponding to each first measurement source, and updating and setting the state quantity of the filter according to the average value so as to reset the filter.
Optionally, the marking module further includes:
the timing unit is used for determining a second measurement source according to the time stamp corresponding to the target measurement source if the first measurement source which is successfully clustered does not exist in the target measurement source according to the clustering mark, and timing by taking the time stamp corresponding to the second measurement source as a timing starting point;
And the state quantity updating unit is used for updating the state quantity of the set filter according to the current frame measurement source data corresponding to the second measurement source if no first measurement source which is clustered successfully exists within the set timeout time so as to realize the reset of the filter.
Optionally, the apparatus further includes, after the filter resetting unit:
The positioning result updating unit is used for updating the last frame fusion positioning result according to the odometer data to obtain a current frame fusion positioning updating result;
And the prediction updating unit is used for performing prediction updating on the state quantity of the set filter and the variance of the prediction error according to the current frame fusion positioning updating result and the last frame fusion positioning result.
Optionally, the prediction update unit includes:
the weight determination subunit is used for acquiring the weight of each path of target measurement source according to the confidence coefficient in the current frame measurement source data participating in fusion positioning and a preset weight;
The observed quantity determining subunit is used for obtaining the observed quantity of the set filter according to the weight of each path of target measurement source and the corresponding current frame measurement source data;
and the fusion positioning result acquisition subunit is used for generating a current frame fusion positioning result according to the acquired observed quantity of the setting filter and the predicted and updated state quantity.
Optionally, the apparatus further includes, after the observed quantity determining subunit:
And the variance updating subunit is used for carrying out observation updating on the variance of the observation error of the set filter according to the variance of the prediction error after the prediction updating and the average value of the observation error of the set filter.
Optionally, the apparatus further includes, after the observed quantity determining subunit:
the confidence coefficient determining subunit is used for obtaining the fusion positioning confidence coefficient of the fusion positioning result of the current frame according to the weight and the confidence coefficient of each target measurement source participating in fusion;
And the fusion positioning state determination subunit is used for determining the fusion positioning state of the fusion positioning result of the current frame according to the fusion positioning confidence.
Optionally, the filter resetting unit further includes:
And the pure motion estimation state subunit is used for updating the fusion positioning state to the pure motion estimation state by taking the fusion positioning updating result of the current frame as the fusion positioning result of the current frame if no target measurement source participates in fusion positioning.
Optionally, after the pure motion estimation state subunit, further comprising:
And the triggering condition judging subunit is used for acquiring the duration time or the duration distance of the pure motion estimation state and judging whether the fusion positioning result of the current frame triggers the filter reset or not based on the duration time or the duration distance.
Optionally, the data update module 520 includes:
The time stamp difference value determining unit is used for determining a time stamp difference value according to the current time stamp and the time stamp in the target measurement source data;
The current frame measuring source generating unit is used for generating a compensation coefficient according to the timestamp difference value, the odometer data and the vehicle wheelbase, updating the target measuring source data corresponding to each path of target measuring source according to the compensation coefficient, and generating the current frame measuring source data of each path of target measuring source.
Optionally, the participation fusion judgment module 530 includes:
The second difference value determining unit is used for determining a second distance difference value and a second course angle difference value of the fusion positioning result of the current frame measurement source data and the previous frame of each path of target measurement source;
The threshold determining unit is used for acquiring a preset first group of thresholds and a preset second group of thresholds, wherein the second group of thresholds are larger than the first group of thresholds, and each group of thresholds comprises a distance threshold and a heading threshold;
the participation fusion positioning determination unit is used for determining that the corresponding target measuring source participates in fusion positioning when the second distance difference value and the second course angle difference value are smaller than the first group of threshold values for each path of target measuring source;
the comparison unit is used for comparing the second distance difference value and the second course angle difference value with a second group of threshold values when the second distance difference value or the second course angle difference value is equal to or larger than the first group of threshold values; and if the second distance difference value and the second course angle difference value are smaller than a second group of threshold values, determining that the corresponding target measuring source participates in fusion positioning.
Optionally, the apparatus further includes, after the fusion positioning module 540:
The third difference value determining unit is used for determining a third distance difference value and a third course angle difference value between the current frame measurement source data of each path of target measurement source and a current frame second fusion positioning result, wherein the current frame second fusion positioning result comprises position data and a course angle in the current frame fusion positioning result;
The abnormal state determining unit is used for determining whether each path of target measuring source is in an abnormal state according to the third distance difference value and the third course angle difference value; if yes, the abnormal state information is sent to the corresponding target measuring source so as to instruct the corresponding target measuring source to reset.
The fusion positioning device provided by the embodiment of the invention can execute the fusion positioning method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Fig. 6 shows a schematic structural diagram of a vehicle 10 that may be used to implement an embodiment of the present invention. As shown in fig. 6, the vehicle 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 can perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the vehicle 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the vehicle 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the vehicle 10 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunications networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as a fusion positioning method.
In some embodiments, a fusion positioning method can be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the vehicle 10 via the ROM 12 and/or the communication unit 19. One or more of the steps of a fusion positioning method described above may be performed when the computer program is loaded into RAM 13 and executed by processor 11. Alternatively, in other embodiments, the processor 11 may be configured to perform a fusion positioning method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described here above can be implemented in digital electronic circuitry, integrated circuit systems, field Programmable Gate Arrays (FPGAs), application Specific Integrated Circuits (ASICs), application Specific Standard Products (ASSPs), systems On Chip (SOCs), complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs, the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a vehicle having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or a trackball) by which a user can provide input to the vehicle. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (17)

1. A fusion positioning method, comprising:
Acquiring odometer data and measurement source data of at least one path of measurement sources when the fusion positioning event is triggered;
Screening out target measurement sources with effective time confidence according to the time stamp in the measurement source data, updating target measurement source data corresponding to the target measurement sources through the odometer data, and generating current frame measurement source data of each path of target measurement sources;
Judging whether a target measurement source corresponding to the current frame measurement source data participates in fusion positioning or not according to the fusion positioning result of the current frame measurement source data and the previous frame;
and for the target measurement sources participating in fusion positioning, a set filter is adopted to generate a current frame fusion positioning result based on the corresponding current frame measurement source data.
2. The method of claim 1, further comprising, after generating the current frame measurement source data for each path of target measurement sources:
Clustering the target measurement sources based on current frame measurement source data of each path of target measurement source, and attaching a clustering mark to each path of target measurement source according to a clustering result.
3. The method as recited in claim 2, further comprising:
determining a first measurement source which is successfully clustered in the target measurement sources according to the clustering mark;
Acquiring cluster center data of the first measurement source, and determining a first distance difference value and a first course difference value of the cluster center data and a first fusion positioning result of a current frame, wherein the first fusion positioning result of the current frame comprises position data and a course angle in the fusion positioning result of the current frame;
determining a starting time stamp of the fusion positioning deviation according to the first distance difference value and the first course angle difference value;
Recording the duration time of the fusion positioning deviation based on the starting time stamp, and determining the fusion positioning state as a biased state when the duration time meets a set condition;
And when the fusion positioning state is a biased state, determining that the first fusion positioning result of the current frame triggers the filter reset.
4. The method as recited in claim 2, further comprising:
If the first measurement source which is successfully clustered does not exist in the target measurement sources according to the clustering mark, determining a second measurement source according to the time stamp corresponding to the target measurement source, and timing by taking the time stamp corresponding to the second measurement source as a timing starting point;
And if the first measurement sources which are clustered successfully do not exist within the set timeout time, updating the state quantity of the set filter according to the current frame measurement source data corresponding to the second measurement source so as to realize filter reset.
5. The method of claim 4, further comprising, after the filter reset:
Updating the last frame fusion positioning result according to the odometer data to obtain a current frame fusion positioning updating result;
And according to the current frame fusion positioning updating result and the last frame fusion positioning result, predicting and updating the state quantity of the set filter and the variance of the prediction error.
6. The method of claim 5, wherein generating a current frame fusion positioning result based on corresponding current frame measurement source data using a set filter comprises:
Acquiring the weight of each path of target measurement source according to the confidence coefficient in the current frame measurement source data participating in fusion positioning and a preset weight;
Obtaining the observed quantity of the set filter according to the weight of each path of target measurement source and the corresponding current frame measurement source data;
And generating a current frame fusion positioning result according to the obtained observed quantity of the setting filter and the predicted and updated state quantity.
7. The method as recited in claim 5, further comprising:
If no target measurement source participates in fusion positioning, the current frame fusion positioning updating result is used as the current frame fusion positioning result, and the fusion positioning state is updated into a pure motion estimation state.
8. The method of claim 7, further comprising, after updating the fused localization state to the pure motion estimation state:
And acquiring the duration time or the duration distance of the pure motion estimation state, and judging whether the fusion positioning result of the current frame triggers the filter reset or not based on the duration time or the duration distance.
9. The method of claim 3, further comprising, after a filter reset is triggered by the current frame first fusion positioning result:
acquiring an average value of the current frame measurement source data corresponding to each first measurement source, and updating the state quantity of the set filter according to the average value to realize the filter reset.
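A one-function sketch of the averaging reset in claim 9, assuming each clustered first measurement source contributes a pose array:

```python
import numpy as np

def reset_state(clustered_poses):
    # clustered_poses: current-frame poses of the successfully clustered first
    # measurement sources; their element-wise mean becomes the new filter state.
    return np.mean(np.stack(clustered_poses), axis=0)

new_state = reset_state([np.array([1.0, 2.0, 0.0]), np.array([3.0, 4.0, 0.2])])
```

Note that a naive arithmetic mean of heading angles is wrong near the ±π wrap; a real implementation would use a circular mean for the yaw component.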
10. The method of claim 6, further comprising, after obtaining the observed quantity of the set filter based on the weight of each target measurement source and the corresponding current frame measurement source data:
performing an observation update on the variance of the observation error of the set filter according to the prediction-updated variance of the prediction error and the average value of the observation error of the set filter.
11. The method of claim 6, further comprising, after obtaining the weight of each target measurement source:
acquiring a fusion positioning confidence of the current frame fusion positioning result according to the weight and the confidence of each target measurement source participating in fusion;
and determining the fusion positioning state of the current frame fusion positioning result according to the fusion positioning confidence.
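One plausible reading of claim 11, shown only as a sketch: the fused confidence is the weight-weighted mean of the per-source confidences, and the fusion positioning state is a qualitative label derived from it. The thresholds and state names are invented:

```python
def fusion_confidence(weights, confidences):
    # Weighted mean of per-source confidences; weights are assumed normalised.
    return sum(w * c for w, c in zip(weights, confidences))

def fusion_state(conf, high=0.8, low=0.4):
    # Hypothetical thresholds mapping the confidence to a qualitative state.
    return "good" if conf >= high else "degraded" if conf >= low else "unreliable"

conf = fusion_confidence([0.6, 0.4], [0.9, 0.5])
```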
12. The method of claim 1, wherein updating the target measurement source data corresponding to the target measurement sources through the odometer data to generate current frame measurement source data for each target measurement source comprises:
determining a timestamp difference according to the current timestamp and the timestamp in the target measurement source data;
and generating a compensation coefficient according to the timestamp difference, the odometer data and the vehicle wheelbase, updating the target measurement source data corresponding to each target measurement source according to the compensation coefficient, and generating the current frame measurement source data of each target measurement source.
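Since claim 12 combines the timestamp difference, odometer data, and the vehicle wheelbase, one common concrete form is a bicycle-model dead-reckoning step: the yaw rate is derived from speed, steering angle, and wheelbase, and the stale measurement is advanced over the elapsed time. This is only a sketch of that family of models; the signature and constant-velocity assumption are not from the patent:

```python
import math

def compensate_to_now(pose, stamp, now, speed, steer_angle, wheelbase):
    # Bicycle-model sketch: yaw_rate = speed * tan(steer) / wheelbase, then
    # dead-reckon the stale measurement forward over dt = now - stamp.
    dt = now - stamp
    x, y, yaw = pose
    yaw_rate = speed * math.tan(steer_angle) / wheelbase
    x += speed * dt * math.cos(yaw)
    y += speed * dt * math.sin(yaw)
    yaw += yaw_rate * dt
    return (x, y, yaw)

pose = compensate_to_now((0.0, 0.0, 0.0), stamp=0.0, now=0.1,
                         speed=10.0, steer_angle=0.0, wheelbase=2.7)
```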
13. The method of claim 1, wherein determining, according to the current frame measurement source data and the previous frame fusion positioning result, whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning comprises:
determining a second distance difference and a second heading angle difference between the current frame measurement source data of each target measurement source and the previous frame fusion positioning result;
acquiring a preset first set of thresholds and a preset second set of thresholds, wherein the second set of thresholds is larger than the first set of thresholds, and each set of thresholds comprises a distance threshold and a heading threshold;
for each target measurement source, determining that the corresponding target measurement source participates in fusion positioning when the second distance difference and the second heading angle difference are both smaller than the first set of thresholds;
comparing the second distance difference and the second heading angle difference with the second set of thresholds when the second distance difference or the second heading angle difference is equal to or greater than the first set of thresholds;
and if the second distance difference and the second heading angle difference are both smaller than the second set of thresholds, determining that the corresponding target measurement source participates in fusion positioning.
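The two-tier gate of claim 13 can be sketched directly; the threshold values here are illustrative placeholders. The staged structure mirrors the claim (strict set first, relaxed set as fallback); since the second set is strictly larger, any pair passing the first check would also pass the second:

```python
def participates_in_fusion(dist_diff, heading_diff,
                           first=(0.5, 0.05), second=(2.0, 0.2)):
    # first/second: (distance threshold, heading threshold); second > first.
    if dist_diff < first[0] and heading_diff < first[1]:
        return True          # within the strict first set of thresholds
    # at least one value failed the strict set: retry against the relaxed set
    return dist_diff < second[0] and heading_diff < second[1]
```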
14. The method of claim 1, further comprising, after generating the current frame fusion positioning result based on the corresponding current frame measurement source data using the set filter:
determining a third distance difference and a third heading angle difference between the current frame measurement source data of each target measurement source and a current frame second fusion positioning result, wherein the current frame second fusion positioning result comprises the position data and the heading angle in the current frame fusion positioning result;
judging, according to the third distance difference and the third heading angle difference, whether each target measurement source is in an abnormal state;
and if so, sending abnormal state information to the corresponding target measurement source to instruct the corresponding target measurement source to reset.
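An illustrative sketch of the consistency check in claim 14: each source's pose is compared against the fused position and heading, and sources deviating beyond the thresholds are flagged so a reset request could be sent to them. The thresholds and data layout are assumptions:

```python
import math

def abnormal_sources(sources, fused, dist_thr=1.0, heading_thr=0.1):
    # sources: {name: (x, y, yaw)}; fused: the second fusion result (x, y, yaw).
    flagged = []
    for name, (x, y, yaw) in sources.items():
        d = math.hypot(x - fused[0], y - fused[1])                     # third distance difference
        dh = abs((yaw - fused[2] + math.pi) % (2 * math.pi) - math.pi)  # wrapped heading difference
        if d >= dist_thr or dh >= heading_thr:
            flagged.append(name)  # abnormal: would trigger a reset request to this source
    return flagged

flagged = abnormal_sources({"lidar": (0.1, 0.0, 0.0), "gnss": (5.0, 0.0, 0.0)},
                           (0.0, 0.0, 0.0))
```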
15. A fusion positioning device, comprising:
a data acquisition module, configured to acquire odometer data and measurement source data of at least one measurement source when a fusion positioning event is triggered;
a data update module, configured to screen out target measurement sources whose time confidence is valid according to the timestamps in the measurement source data, update the target measurement source data corresponding to the target measurement sources through the odometer data, and generate current frame measurement source data of each target measurement source;
a fusion participation judging module, configured to judge, according to the current frame measurement source data and the previous frame fusion positioning result, whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning;
and a fusion positioning module, configured to generate, for the target measurement sources participating in fusion positioning, a current frame fusion positioning result based on the corresponding current frame measurement source data by using a set filter.
16. A vehicle, characterized in that the vehicle comprises:
at least one measurement source, configured to provide measurement source data during travel of the vehicle;
an odometer, configured to provide odometer data during travel of the vehicle;
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, to enable the at least one processor to perform the fusion positioning method of any one of claims 1-14.
17. A computer readable storage medium storing computer instructions for causing a processor to perform the fusion positioning method of any one of claims 1-14.
CN202210540746.6A 2022-05-17 2022-05-17 Fusion positioning method, device, vehicle and storage medium Active CN114964270B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210540746.6A CN114964270B (en) 2022-05-17 2022-05-17 Fusion positioning method, device, vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210540746.6A CN114964270B (en) 2022-05-17 2022-05-17 Fusion positioning method, device, vehicle and storage medium

Publications (2)

Publication Number Publication Date
CN114964270A CN114964270A (en) 2022-08-30
CN114964270B true CN114964270B (en) 2024-04-26

Family

ID=82982565

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210540746.6A Active CN114964270B (en) 2022-05-17 2022-05-17 Fusion positioning method, device, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN114964270B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115856976B (en) * 2023-02-27 2023-06-02 智道网联科技(北京)有限公司 Fusion positioning method and device for automatic driving vehicle and electronic equipment

Citations (8)

Publication number Priority date Publication date Assignee Title
JP2017125820A (en) * 2016-01-15 2017-07-20 三菱電機株式会社 Information processing apparatus, information processing method, and information processing program
CN109099920A (en) * 2018-07-20 2018-12-28 重庆长安汽车股份有限公司 Sensor target accurate positioning method based on Multisensor association
CN109579844A (en) * 2018-12-04 2019-04-05 电子科技大学 Localization method and system
WO2019114807A1 (en) * 2017-12-15 2019-06-20 蔚来汽车有限公司 Multi-sensor target information fusion
CN110631574A (en) * 2018-06-22 2019-12-31 北京自动化控制设备研究所 inertia/odometer/RTK multi-information fusion method
CN111121755A (en) * 2020-01-02 2020-05-08 广东博智林机器人有限公司 Multi-sensor fusion positioning method, device, equipment and storage medium
CN111141273A (en) * 2019-12-18 2020-05-12 无锡北微传感科技有限公司 Combined navigation method and system based on multi-sensor fusion
CN113327344A (en) * 2021-05-27 2021-08-31 北京百度网讯科技有限公司 Fusion positioning method, device, equipment, storage medium and program product

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US9784582B2 (en) * 2011-09-14 2017-10-10 Invensense, Inc. Method and apparatus for navigation with nonlinear models
CN113945206B (en) * 2020-07-16 2024-04-19 北京图森未来科技有限公司 Positioning method and device based on multi-sensor fusion

Patent Citations (8)

Publication number Priority date Publication date Assignee Title
JP2017125820A (en) * 2016-01-15 2017-07-20 三菱電機株式会社 Information processing apparatus, information processing method, and information processing program
WO2019114807A1 (en) * 2017-12-15 2019-06-20 蔚来汽车有限公司 Multi-sensor target information fusion
CN110631574A (en) * 2018-06-22 2019-12-31 北京自动化控制设备研究所 inertia/odometer/RTK multi-information fusion method
CN109099920A (en) * 2018-07-20 2018-12-28 重庆长安汽车股份有限公司 Sensor target accurate positioning method based on Multisensor association
CN109579844A (en) * 2018-12-04 2019-04-05 电子科技大学 Localization method and system
CN111141273A (en) * 2019-12-18 2020-05-12 无锡北微传感科技有限公司 Combined navigation method and system based on multi-sensor fusion
CN111121755A (en) * 2020-01-02 2020-05-08 广东博智林机器人有限公司 Multi-sensor fusion positioning method, device, equipment and storage medium
CN113327344A (en) * 2021-05-27 2021-08-31 北京百度网讯科技有限公司 Fusion positioning method, device, equipment, storage medium and program product

Non-Patent Citations (2)

Title
Design of Intelligent Mobile Robot Positioning Algorithm Based on IMU/Odometer/Lidar; Zhaodong Li; 2019 International Conference on Sensing, Diagnostics, Prognostics, and Control; 2020-09-18; full text *
Research on a vehicle-mounted mobile mapping *** based on multi-sensor fusion; Chen Yunfang; Ye Zetian; Bulletin of Surveying and Mapping; 2007-01-25 (No. 01); full text *

Also Published As

Publication number Publication date
CN114964270A (en) 2022-08-30

Similar Documents

Publication Publication Date Title
US10422658B2 (en) Method, fusion filter, and system for fusing sensor signals with different temporal signal output delays into a fusion data set
JP5017392B2 (en) Position estimation apparatus and position estimation method
CN111077549B (en) Position data correction method, apparatus and computer readable storage medium
CN107884800B (en) Combined navigation data resolving method and device for observation time-lag system and navigation equipment
CN114179825B (en) Method for obtaining confidence of measurement value through multi-sensor fusion and automatic driving vehicle
CN114964270B (en) Fusion positioning method, device, vehicle and storage medium
CN109059907A (en) Track data processing method, device, computer equipment and storage medium
US20150294453A1 (en) Image analysis apparatus mounted to vehicle
CN110637209B (en) Method, apparatus and computer readable storage medium having instructions for estimating a pose of a motor vehicle
CN107943859B (en) System and method for collecting, processing and feeding back mass sensor data
WO2023142353A1 (en) Pose prediction method and apparatus
CN115752471A (en) Sensor data processing method and device and computer readable storage medium
CN111183464B (en) System and method for estimating saturation flow of signal intersection based on vehicle trajectory data
WO2017141469A1 (en) Position estimation device
CN114119744A (en) Method, device and equipment for constructing point cloud map and storage medium
WO2020018140A1 Ballistic estimation of vehicle data
CN113310505A (en) External parameter calibration method and device of sensor system and electronic equipment
CN117029857A (en) Vehicle perception fusion method, device and storage medium based on high-precision map
CN113932815B (en) Robustness optimization Kalman filtering relative navigation method, device, equipment and storage medium
CN111829552B (en) Error correction method and device for visual inertial system
CN113865586A (en) Estimation method and device of installation angle and automatic driving system
CN115037703A (en) Data processing method, data processing apparatus, computer storage medium, and computer program product
CN115328893A (en) Data processing method, device, equipment and computer storage medium
CN112346479A (en) Unmanned aircraft state estimation method based on centralized Kalman filtering
CN111191734A (en) Sensor data fusion method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant