CN111586632B - Cooperative neighbor vehicle positioning method based on communication sensing asynchronous data fusion - Google Patents

Cooperative neighbor vehicle positioning method based on communication sensing asynchronous data fusion

Info

Publication number
CN111586632B
CN111586632B
Authority
CN
China
Prior art keywords
vehicle
neighbor
state
time
particle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010372427.XA
Other languages
Chinese (zh)
Other versions
CN111586632A (en)
Inventor
单杭冠
洪春华
项志宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University (ZJU)
Priority claimed from CN202010372427.XA
Publication of CN111586632A
Application granted
Publication of CN111586632B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/123 Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/52 Network services specially adapted for the location of the user terminal

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The invention discloses a cooperative neighbor vehicle positioning method based on communication-sensing asynchronous data fusion, comprising the following steps: (1) a vehicle obtains multi-source asynchronous state measurements of its neighbor vehicles through communication, on-board sensing and related technologies; (2) the vehicle fuses the multi-source asynchronous data with a particle filtering method, improving the positioning accuracy of the neighbor vehicles. The method addresses the scenario in which vehicles with communication capability estimate the positions of all their neighbor vehicles while driving on the road. By combining the advantages of communication and sensors it achieves a wide measurement range and high reliability, and because it allows for possible communication and sensing failures it adapts well to changing conditions; at the same time, multi-source asynchronous data fusion markedly improves positioning accuracy and thereby safeguards driving safety.

Description

Cooperative neighbor vehicle positioning method based on communication sensing asynchronous data fusion
Technical Field
The invention belongs to the technical field of vehicle positioning, and particularly relates to a cooperative neighbor vehicle positioning method based on communication sensing asynchronous data fusion.
Background
In the field of intelligent transportation, positioning errors for neighboring vehicles cause control errors and safety hazards, so neighbor positioning has a key influence on vehicle safety and on the efficiency of cooperative driving. Traditional methods locate neighboring vehicles with sensors such as radar and cameras, but the limited measurement range of these sensors and their sensitivity to line of sight and weather make sensor-only positioning unstable and relatively inaccurate. The development of vehicular networking lets vehicles exchange state information and has promoted cooperative vehicle positioning; communication, however, suffers from packet loss and delay, and the position, speed and other state measurements transmitted over it carry their own errors. Improving the positioning accuracy of neighboring vehicles therefore remains an open problem.
In the field of vehicle positioning, U.S. patent publication No. US2019346860 proposes an automatic vehicle positioning method in which a vehicle computes its own speed and its relative orientation to at least two 5G transmission points from the millimeter-wave signals they transmit, and then positions itself. Chinese patent publication No. CN110657812 proposes a vehicle positioning method and apparatus in which road features are extracted from camera images and matched against a navigation map to determine the vehicle's position. Brambilla et al. ("Vehicle positioning by cooperative feature association and tracking in vehicular networks," IEEE Statistical Signal Processing Workshop, 2018) propose a distributed Bayesian data association and positioning method in which a vehicle first locates a series of passive feature targets (roadside pedestrians, etc.) through V2V (vehicle-to-vehicle) communication and cooperation with neighbor vehicles, and then improves its own positioning accuracy; these prior approaches, however, position the ego vehicle and do not consider positioning the neighbor vehicles. Nam et al. ("CNVPS: Cooperative neighbor vehicle positioning system based on neighbor-to-neighbor communication," IEEE Access, January 2019) propose a cooperative neighbor vehicle positioning system in which each vehicle measures positioning information of its neighbor vehicles, shares it with all neighbors through V2V communication, and, after collecting several such measurements of a neighbor, positions that neighbor by maximum likelihood estimation to improve accuracy.
Disclosure of Invention
In view of the above, the invention provides a cooperative neighbor vehicle positioning method based on communication-sensing asynchronous data fusion, which can markedly improve the positioning accuracy of neighbor vehicles.
A cooperative neighbor vehicle positioning method based on communication sensing asynchronous data fusion comprises the following steps:
(1) each vehicle periodically measures motion state information, including position and speed, of itself and its neighbor vehicles, broadcasts its own state information to all neighbor vehicles over V2V, and receives the state measurements broadcast by the neighbor vehicles;
(2) the vehicle extracts the vehicle ID, the information acquisition time, the position and the speed from the received neighbor information, and judges from the neighbor's ID and the acquisition time whether a new filtering process must be created for that neighbor;
(3) the vehicle estimates the neighbor's position with a communication-sensing asynchronous data fusion algorithm: the particle states at the current time are predicted from the particle states at the previous time in the filtering process, the particle weights are then updated with the obtained neighbor state measurement, and finally the neighbor's motion state is estimated and the particles are resampled.
Further, in step (1) the own vehicle is denoted i and any of its neighbor vehicles is denoted j; the motion states of vehicles i and j at time t are expressed as

S_i(t) = [x_i(t), y_i(t), v_i(t), θ_i(t)]^T
S_j(t) = [x_j(t), y_j(t), v_j(t), θ_j(t)]^T

where S_i(t) and S_j(t) are the motion states of vehicles i and j at time t; x_i(t) and x_j(t) are their lateral positions at time t; y_i(t) and y_j(t) their longitudinal positions; v_i(t) and v_j(t) their speeds; and θ_i(t) and θ_j(t) the angles between their driving directions and due east at time t. The superscript T denotes transposition, and t is a non-negative real number.
Further, the k-th state measurement of neighbor vehicle j obtained by the own vehicle i in step (1) is denoted Y_{i,j,k}^{r_k}, where k is a natural number and r_k ∈ {s, g}. When r_k = s, the information was measured by a sensor of vehicle i and

Y_{i,j,k}^{s} = S_{j,k} + n_{i,k} + n_{i,j,k}

where S_{j,k} = S_j(t_k) is the motion state of vehicle j at the corresponding time t_k, n_{i,k} is the measurement error of vehicle i on its own state, and n_{i,j,k} is the sensing measurement error of vehicle i on vehicle j. When r_k = g, the information was received by vehicle i over communication and

Y_{i,j,k}^{g} = S_{j,k} + n_{j,k}

where n_{j,k} is the measurement error of vehicle j on its own state.
Further, in step (2), for the k-th state measurement Y_{i,j,k}^{r_k} of neighbor vehicle j obtained at time t_k, the own vehicle i first judges from the ID of vehicle j whether j is a new neighbor. If it is, a new filtering process is created for it. If not, a filtering process for j already exists in vehicle i, and the freshness of Y_{i,j,k}^{r_k} is judged from the acquisition time t_{k-1} of the previous state measurement: if t_k - t_{k-1} > aT, the interval is too long, so the original filtering process is deleted and a new one is created for the neighbor; otherwise processing continues on the original filtering process. Here T is the sensor measurement period and a, a positive integer, is a preset timeliness parameter.
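As a non-limiting illustration, the bookkeeping of step (2) can be sketched in Python as follows; the function and variable names are hypothetical, not part of the invention.

```python
def needs_new_filter(last_times, neighbor_id, t_k, a, T):
    """Decide whether a new filtering process must be created for a
    neighbor: either the neighbor is unknown, or the previous
    measurement is stale (t_k - t_{k-1} > a*T).

    last_times  -- dict: neighbor ID -> acquisition time t_{k-1}
    neighbor_id -- ID of neighbor vehicle j
    t_k         -- acquisition time of the new measurement
    a           -- preset timeliness parameter (positive integer)
    T           -- sensor measurement period
    """
    if neighbor_id not in last_times:                 # new neighbor vehicle
        return True
    return t_k - last_times[neighbor_id] > a * T      # stale process
```

With a = 3 and T = 0.1 s, a gap of 0.5 s between measurements exceeds aT and triggers re-creation of the filtering process, while a gap of 0.2 s does not.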
Further, in the communication-sensing asynchronous data fusion algorithm of step (3), for a neighbor vehicle j for which vehicle i must create a new filtering process, i.e. k = 0: when r_k = s, Y_{i,j,0}^{s} = S_{j,0} + n_{i,0} + n_{i,j,0}, so S_{j,0} follows a Gaussian distribution with mean Y_{i,j,0}^{s} and covariance matrix C_i + C_{i,j}; when r_k = g, Y_{i,j,0}^{g} = S_{j,0} + n_{j,0}, so S_{j,0} follows a Gaussian distribution with mean Y_{i,j,0}^{g} and covariance matrix C_j. With the distribution of S_{j,0} known, each particle state S_0^m is randomly sampled from it and each particle weight is set to w_0^m = 1/M, m = 1, 2, ..., M, where M is the number of particles.
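For illustration, the k = 0 initialization can be sketched as follows (Python, names illustrative): each particle is drawn from the Gaussian distribution of the initial state, given its mean (the first measurement) and the diagonal entries of the corresponding covariance matrix, and all weights are set to 1/M.

```python
import random

def init_particles(y0, variances, M, seed=None):
    """Initialize the filtering process from the first measurement:
    y0 is the mean vector, variances are the diagonal entries of the
    covariance matrix (C_i + C_{i,j} in the sensor case, C_j in the
    communication case)."""
    rng = random.Random(seed)
    particles = [[rng.gauss(mu, var ** 0.5) for mu, var in zip(y0, variances)]
                 for _ in range(M)]
    weights = [1.0 / M] * M          # uniform initial weights
    return particles, weights
```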
Further, the covariance matrices C_i, C_j and C_{i,j} are set as

C_i = diag(φ_i^{g,x}, φ_i^{g,y}, φ_i^{g,v}, φ_i^{g,θ})
C_j = diag(φ_j^{g,x}, φ_j^{g,y}, φ_j^{g,v}, φ_j^{g,θ})
C_{i,j} = diag(φ_{i,j}^{s,x}, φ_{i,j}^{s,y}, φ_{i,j}^{s,v}, φ_{i,j}^{s,θ})

where φ_i^{g,x}, φ_i^{g,y}, φ_i^{g,v} and φ_i^{g,θ} are the noise variances of vehicle i's GPS measurement of its own lateral position, longitudinal position, speed and angle; φ_j^{g,x}, φ_j^{g,y}, φ_j^{g,v} and φ_j^{g,θ} are the corresponding noise variances of vehicle j's GPS measurement of its own state; and φ_{i,j}^{s,x}, φ_{i,j}^{s,y}, φ_{i,j}^{s,v} and φ_{i,j}^{s,θ} are the noise variances of vehicle i's sensor measurement of the lateral relative position, longitudinal relative position, relative speed and relative angle of vehicle j.
Further, for a neighbor vehicle j for which a filtering process already exists in vehicle i, i.e. k ≠ 0, the communication-sensing asynchronous data fusion algorithm of step (3) predicts the particle states at the current time from those at the previous time as follows.

If r_{k-1} = r_k, i.e. the state measurements of neighbor j at times t_{k-1} and t_k were obtained by vehicle i in the same way, then t_k - t_{k-1} = bT with b a positive integer smaller than a. For each particle, the motion speed noise n_j^v(t_{k-1}) and motion angle noise n_j^θ(t_{k-1}) of vehicle j at time t_{k-1} are randomly sampled from their noise distributions; combining the particle state S_{k-1}^m at time t_{k-1} with the discrete nonlinear dynamic model then predicts the motion state S_j(t_{k-1}+T) of vehicle j at time t_{k-1}+T, and repeating this yields the particle state S_k^m at time t_k, m = 1, 2, ..., M, where M is the number of particles.

If r_{k-1} ≠ r_k, i.e. the two state measurements were obtained by vehicle i in different ways, then t_k - t_{k-1} = bT + τ, where τ < T is the information-acquisition time difference between vehicles i and j. For each particle, starting from the particle state S_{k-1}^m at time t_{k-1}, the discrete nonlinear dynamic model predicts step by step the motion states S_j(t_{k-1}+bT) and S_j(t_{k-1}+(b+1)T) of vehicle j at times t_{k-1}+bT and t_{k-1}+(b+1)T; the particle state S_k^m at time t_k is then obtained by the smoothing

S_k^m = (1 - α) S_j(t_{k-1}+bT) + α S_j(t_{k-1}+(b+1)T)

where α = τ/T.
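The smoothing above is a componentwise linear interpolation between two model predictions; a minimal sketch, assuming states are lists of equal length:

```python
def smooth_state(s_b, s_b1, tau, T):
    """Interpolate between the predictions at t_{k-1}+bT (s_b) and
    t_{k-1}+(b+1)T (s_b1) with alpha = tau/T, where tau < T."""
    alpha = tau / T
    return [(1 - alpha) * p + alpha * q for p, q in zip(s_b, s_b1)]
```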
Further, the discrete nonlinear dynamic model is expressed as

x_j(t+T) = x_j(t) + (v_j(t) + n_j^v(t)) T cos(θ_j(t) + n_j^θ(t))
y_j(t+T) = y_j(t) + (v_j(t) + n_j^v(t)) T sin(θ_j(t) + n_j^θ(t))
v_j(t+T) = v_j(t) + n_j^v(t)
θ_j(t+T) = θ_j(t) + n_j^θ(t)

where x_j(t+T) is the lateral position of vehicle j at time t+T, y_j(t+T) its longitudinal position, v_j(t+T) its speed, and θ_j(t+T) the angle between its driving direction and due east at time t+T; n_j^v(t) and n_j^θ(t) are the motion speed noise and motion angle noise of vehicle j at time t.
Further, for a neighbor vehicle j for which a filtering process exists in vehicle i, i.e. k ≠ 0, the communication-sensing asynchronous data fusion algorithm of step (3) updates the particle weights with the obtained neighbor state measurement: for each particle, the weight is updated by

w_k^m ∝ w_{k-1}^m · p(Y_{i,j,k}^{r_k} | S_k^m)

where w_{k-1}^m and w_k^m are the weights of particle m at times t_{k-1} and t_k. When r_k = s, p(Y_{i,j,k}^{s} | S_k^m) is the value at Y_{i,j,k}^{s} of the probability density function of the Gaussian distribution with mean S_k^m and covariance matrix C_i + C_{i,j}; when r_k = g, it is the value at Y_{i,j,k}^{g} of the probability density function of the Gaussian distribution with mean S_k^m and covariance matrix C_j.
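For illustration, the weight update can be sketched with a diagonal-covariance Gaussian likelihood (Python, names illustrative); the variances passed in are the diagonal of C_i + C_{i,j} in the sensor case or of C_j in the communication case:

```python
import math

def gauss_pdf(y, mean, variances):
    """Density at y of a Gaussian with the given mean vector and
    diagonal covariance (per-component variances)."""
    p = 1.0
    for yi, mi, var in zip(y, mean, variances):
        p *= math.exp(-(yi - mi) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)
    return p

def update_weights(weights, particles, y, variances):
    """w_k^m proportional to w_{k-1}^m * p(Y | S_k^m), then normalized."""
    new_w = [w * gauss_pdf(y, s, variances) for w, s in zip(weights, particles)]
    total = sum(new_w)
    return [w / total for w in new_w]
```

A particle whose state matches the measurement receives a larger posterior weight than one far from it.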
Further, for a neighbor vehicle j for which a filtering process exists in vehicle i, i.e. k ≠ 0, the communication-sensing asynchronous data fusion algorithm of step (3) computes the motion state estimate of j as

Ŝ_j(t_k) = Σ_{m=1}^{M} w_k^m S_k^m

where Ŝ_j(t_k) is the estimate of the motion state of neighbor vehicle j at time t_k and the weights w_k^m are normalized so that they sum to 1.
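The state estimate is simply the weighted mean of the particle states; a minimal sketch:

```python
def estimate_state(particles, weights):
    """Weighted mean sum_m w_k^m * S_k^m (weights normalized to 1)."""
    return [sum(w * p[d] for w, p in zip(weights, particles))
            for d in range(len(particles[0]))]
```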
Further, in the communication-sensing asynchronous data fusion algorithm of step (3), for a neighbor vehicle j for which a filtering process exists in vehicle i, i.e. k ≠ 0, the particles are resampled as follows. When the effective particle number N_eff is smaller than a preset threshold N_th, particle resampling is performed, with

N_eff = 1 / Σ_{m=1}^{M} (w_k^m)^2.

First the particle-weight interval (0, 1] is divided into M subintervals (λ_{m-1}, λ_m], with λ_0 = 0 and λ_m = Σ_{l=1}^{m} w_k^l. Then M values u_l, l = 1, 2, ..., M, uniformly distributed on [0, 1], are generated at random; when u_l falls in the interval (λ_{m-1}, λ_m], S_k^m is taken as the corresponding new particle state, and the new particle weights are all set to 1/M.
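The resampling step above can be sketched as follows (Python; multinomial resampling on the cumulative weights, names illustrative):

```python
import random

def n_eff(weights):
    """Effective particle number N_eff = 1 / sum_m (w^m)^2."""
    return 1.0 / sum(w * w for w in weights)

def resample(particles, weights, n_th, rng=random):
    """When N_eff < N_th, draw M uniform values u_l on [0, 1] and copy
    the particle whose cumulative-weight subinterval (lambda_{m-1},
    lambda_m] contains u_l; all new weights become 1/M."""
    M = len(particles)
    if n_eff(weights) >= n_th:
        return particles, weights            # no resampling needed
    lam, acc = [0.0], 0.0                    # lambda_0 = 0
    for w in weights:
        acc += w
        lam.append(acc)                      # lambda_m = sum_{l<=m} w_l
    new_particles = []
    for _ in range(M):
        u = rng.random()
        m = next((i for i in range(1, M + 1) if u <= lam[i]), M)
        new_particles.append(list(particles[m - 1]))
    return new_particles, [1.0 / M] * M
```

With a degenerate weight vector, every draw falls in the subinterval of the heavy particle, which is duplicated while the total number of particles stays unchanged.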
Based on the above technical scheme, the invention has the following beneficial technical effects:
1. The proposed communication-sensing asynchronous data fusion positioning method combines the advantages of sensors and of communication; it has a wide measurement range and strong environmental adaptability, and it markedly improves the positioning accuracy of neighbor vehicles.
2. The proposed particle filter fusion method performs filtering fusion in real time whenever a new measurement arrives; it is computationally simple and effective, and it solves the multi-source asynchronous data fusion problem in cooperative neighbor vehicle positioning.
Drawings
Fig. 1 is a system scenario schematic diagram of the cooperative neighbor vehicle positioning method of the present invention.
Fig. 2 is a schematic diagram of state information acquisition in the cooperative neighbor vehicle positioning method of the present invention.
Fig. 3 is a schematic diagram of the filtering fusion process of the cooperative neighbor vehicle positioning method of the present invention.
Fig. 4 is a diagram of the vehicle position estimation results of the cooperative neighbor vehicle positioning method of the present invention.
Fig. 5 is a diagram of the vehicle position estimation error results of the cooperative neighbor vehicle positioning method of the present invention.
Fig. 6 is a graph of the mean square error of the vehicle position estimate of the cooperative neighbor vehicle positioning method of the present invention, as influenced by the GPS and sensor noise variances.
Detailed Description
In order to describe the present invention more specifically, its technical solution is detailed below with reference to the accompanying drawings and specific embodiments.
The invention considers any vehicle driving on any road; the vehicle periodically measures the state information of itself and of its neighbor vehicles and then broadcasts its own state measurement using V2V communication. The measurement period of every vehicle is T, but the measurement times may be asynchronous. As shown in Fig. 1, let the state of the neighbor vehicle j of vehicle i at time t be:
S_j(t) = [x_j(t), y_j(t), v_j(t), θ_j(t)]^T    (1)

where x_j(t), y_j(t), v_j(t) and θ_j(t) represent the lateral position, longitudinal position, speed and angle of the driving direction to horizontal due east of vehicle j at time t, respectively. The discrete dynamic model of vehicle j with period T can be modeled by the nonlinear system

x_j(t+T) = x_j(t) + (v_j(t) + n_j^v(t)) T cos(θ_j(t) + n_j^θ(t))
y_j(t+T) = y_j(t) + (v_j(t) + n_j^v(t)) T sin(θ_j(t) + n_j^θ(t))    (2)
v_j(t+T) = v_j(t) + n_j^v(t)
θ_j(t+T) = θ_j(t) + n_j^θ(t)

where n_j^v(t) represents the driving speed noise of vehicle j, Gaussian with mean 0 and variance φ_j^{d,v}, and n_j^θ(t) represents the driving angle noise of vehicle j, Gaussian with mean 0 and variance φ_j^{d,θ}.
The measurement by vehicle j of its own state at time t is

Y_j(t) = S_j(t) + n_j(t)    (3)

where n_j(t) is the self-measurement Gaussian noise of vehicle j at time t, with mean 0 and covariance matrix C_j. Its components n_j^{g,x}(t) and n_j^{g,y}(t) represent the absolute position measurement noise in the lateral direction (the x direction in Fig. 1) and the longitudinal direction (the y direction in Fig. 1), n_j^{g,v}(t) the speed measurement noise, and n_j^{g,θ}(t) the driving angle measurement noise; the components are assumed mutually independent. After vehicle j measures its own motion state, it broadcasts its state to all neighbor vehicles using V2V communication.
On the other hand, vehicle i can periodically measure the state of neighbor vehicle j with its own sensors, obtaining the relative motion state measurement of the two vehicles:

Y_{i,j}^{rel}(t) = S_j(t) - S_i(t) + n_{i,j}(t)    (4)

where n_{i,j}(t) is the sensing Gaussian noise of vehicle i observing vehicle j at time t, with mean 0 and covariance matrix C_{i,j}. Its components n_{i,j}^{s,x}(t) and n_{i,j}^{s,y}(t) represent the position sensing noise in the lateral direction (the x direction in Fig. 1) and the longitudinal direction (the y direction in Fig. 1), n_{i,j}^{s,v}(t) the speed sensing noise, and n_{i,j}^{s,θ}(t) the driving angle sensing noise; the components are assumed mutually independent. The motion state of vehicle j is then estimated by vehicle i as

Y_{i,j}(t) = Y_i(t) + Y_{i,j}^{rel}(t) = S_j(t) + n_i(t) + n_{i,j}(t)    (5)

where Y_i(t) = S_i(t) + n_i(t) represents the measurement by vehicle i of its own state, and n_i(t) represents the self-measurement Gaussian noise of vehicle i, with mean 0 and covariance matrix C_i. The covariance matrices C_i, C_j and C_{i,j} are set as

C_i = diag(φ_i^{g,x}, φ_i^{g,y}, φ_i^{g,v}, φ_i^{g,θ})
C_j = diag(φ_j^{g,x}, φ_j^{g,y}, φ_j^{g,v}, φ_j^{g,θ})    (6)
C_{i,j} = diag(φ_{i,j}^{s,x}, φ_{i,j}^{s,y}, φ_{i,j}^{s,v}, φ_{i,j}^{s,θ})

where φ_i^{g,x}, φ_i^{g,y}, φ_i^{g,v} and φ_i^{g,θ} are the noise variances of vehicle i's GPS measurement of its own lateral position, longitudinal position, speed and angle; φ_j^{g,x}, φ_j^{g,y}, φ_j^{g,v} and φ_j^{g,θ} are the corresponding variances for vehicle j; and φ_{i,j}^{s,x}, φ_{i,j}^{s,y}, φ_{i,j}^{s,v} and φ_{i,j}^{s,θ} are the noise variances of vehicle i's sensor measurement of the lateral relative position, longitudinal relative position, relative speed and relative angle of vehicle j.
Fig. 2 shows the two types of state measurement information about vehicle j obtained by vehicle i: communication reception and sensor measurement. Let the k-th state measurement of vehicle j obtained by vehicle i be Y_{i,j,k}^{r_k}, where k is a natural number and r_k ∈ {g, s}: r_k = g indicates that the information was received by communication, r_k = s that it was measured by a sensor. The acquisition time of this information is denoted t_k.
After the vehicle acquires a neighbor state measurement, it performs asynchronous filtering fusion on it to improve the accuracy of the neighbor's positioning estimate. The processing flow, shown in Fig. 3, comprises the following steps (taking the positioning of vehicle j by vehicle i as an example):
(1) vehicle i obtains the state measurement Y_{i,j,k}^{r_k} of neighbor vehicle j with its sensors and communication equipment;
(2) vehicle i judges whether it already runs a processing process for vehicle j; if not, a new process is created and the state measurements of vehicle j are counted from k = 0;
(3) otherwise, it judges whether t_k - t_{k-1} > aT holds (the positive integer a is a preset timeliness parameter); if so, the existing process for vehicle j is deleted, a new process is created, and the state measurements of vehicle j are counted again from k = 0;
(4) otherwise, the positioning of the neighbor vehicle is estimated with the communication-sensing asynchronous data filtering fusion algorithm.
The communication-sensing asynchronous data fusion algorithm is based on sequential importance sampling particle filtering: a set of particles approximates the posterior probability density function of the vehicle state, and particle state prediction and weight updating continually correct the filtering process, achieving asynchronous data fusion. The algorithm proceeds as follows.
If k = 0, the filtering process is initialized: when r_k = s, Y_{i,j,0}^{s} = S_{j,0} + n_{i,0} + n_{i,j,0}, so S_{j,0} follows a Gaussian distribution with mean Y_{i,j,0}^{s} and covariance matrix C_i + C_{i,j}; when r_k = g, Y_{i,j,0}^{g} = S_{j,0} + n_{j,0}, so S_{j,0} follows a Gaussian distribution with mean Y_{i,j,0}^{g} and covariance matrix C_j. With the distribution of S_{j,0} known, each particle state S_0^m (m = 1, 2, ..., M, M being the total number of particles) is randomly sampled from it and the particle weights are set to w_0^m = 1/M.
If k ≠ 0, the following iterative procedure is performed:
A. State prediction:
If r_{k-1} = r_k, the information at the previous time and at this time was obtained in the same way, both by vehicle i's sensing measurement or both by communication reception, and t_k - t_{k-1} = bT with b < a (b a positive integer). For each particle m (m = 1, 2, ..., M), the motion speed and angle noise of vehicle j, n_j^v and n_j^θ, are randomly sampled from their noise distributions; combined with the particle state S_{k-1}^m at the previous time, the dynamic model (2) predicts the vehicle motion state S_j(t_{k-1}+T) at time t_{k-1}+T, and repeating this yields the particle state S_k^m at time t_k.
If r_{k-1} ≠ r_k, the information at the previous time and at this time is of different measurement types, and t_k - t_{k-1} = bT + τ with b < a, where τ < T represents the information-acquisition time difference between vehicles i and j. For each particle m, starting from the particle state S_{k-1}^m at the previous time, the dynamic model predicts step by step the vehicle motion states S_j(t_{k-1}+bT) and S_j(t_{k-1}+(b+1)T) at times t_{k-1}+bT and t_{k-1}+(b+1)T; the predicted state of particle m at time t_k is then obtained by smoothing, namely:

S_k^m = (1 - α) S_j(t_{k-1}+bT) + α S_j(t_{k-1}+(b+1)T),  α = τ/T.
B. Weight update:
The particle weights are updated according to the state observation of vehicle j at the current time, first judging r_k. If r_k = s, then for each particle m, with the particle state S_k^m and the state measurement Y_{i,j,k}^{s} known, the likelihood p(Y_{i,j} | S_j) evaluated at Y_{i,j,k}^{s} is the value of the probability density function of the Gaussian distribution with mean S_k^m and covariance matrix C_i + C_{i,j}; the particle weight is then updated by w_k^m ∝ w_{k-1}^m · p(Y_{i,j,k}^{s} | S_k^m). If r_k = g, the likelihood evaluated at Y_{i,j,k}^{g} is the value of the probability density function of the Gaussian distribution with mean S_k^m and covariance matrix C_j, and the particle weight is updated by w_k^m ∝ w_{k-1}^m · p(Y_{i,j,k}^{g} | S_k^m).
C. State estimation:
The estimate of the motion state of vehicle j at time t_k is obtained as

Ŝ_j(t_k) = Σ_{m=1}^{M} w_k^m S_k^m

where the weights w_k^m have been normalized so that they sum to 1.
D. Resampling:
The basic idea of resampling is to duplicate particles with large weights and eliminate particles with small weights while keeping the total number of particles unchanged. Define

N_eff = 1 / Σ_{m=1}^{M} (w_k^m)^2

as the effective particle number, which measures the degradation of the particle weights; when N_eff is smaller than a preset threshold N_th, resampling is carried out. First, the interval (0, 1] is divided according to the normalized particle weights into M intervals I_m = (λ_{m-1}, λ_m], where λ_0 = 0 and λ_m = Σ_{l=1}^{m} w_k^l. Then M values u_l, l = 1, 2, ..., M, uniformly distributed on [0, 1], are generated at random; when u_l falls within interval I_m, the particle state S_k^m is copied as a new particle state, and all new particle weights are set to 1/M.
The beneficial effect of the technical scheme of the invention can be verified by simulation. The driving trajectories of 9000 vehicles on a map of part of the Bologna region in Italy were simulated with the SUMO software, recording the position, speed, driving angle and other information of every vehicle every 0.01 s. The vehicle trajectories were then imported into the NS-3 software to simulate periodic wireless communication between the moving vehicles, using the IEEE 802.11p protocol as the wireless medium access control (MAC) layer protocol and a log-distance fading model for the wireless channel; the vehicle transmit power was set to 16 dBm and the receive threshold to -96 dBm, corresponding to a communication range of 150 m.
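For illustration only, the stated link budget (16 dBm transmit power, -96 dBm receive threshold, 150 m range) can be checked against a log-distance model; the 1 m free-space reference loss at a 5.9 GHz carrier and the derived path-loss exponent are assumptions made here, not parameters of the invention:

```python
import math

TX_DBM, RX_TH_DBM, RANGE_M = 16.0, -96.0, 150.0
# assumed free-space reference loss at d0 = 1 m for a 5.9 GHz carrier
PL0_DB = 20.0 * math.log10(4.0 * math.pi * 1.0 * 5.9e9 / 3.0e8)
# exponent chosen so the link budget is exhausted exactly at 150 m
N_EXP = (TX_DBM - RX_TH_DBM - PL0_DB) / (10.0 * math.log10(RANGE_M))

def rx_power_dbm(d):
    """Received power (dBm) under the log-distance fading model."""
    return TX_DBM - (PL0_DB + 10.0 * N_EXP * math.log10(d))

def in_range(d):
    """Whether a receiver at distance d is within communication range."""
    return rx_power_dbm(d) >= RX_TH_DBM
```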
A vehicle is then selected at random and the proposed algorithm is applied to filter and fuse the information that vehicle obtains from the communication and sensing simulation. The particle number M is set to 100 and the effective particle number threshold N_th to 30; for any vehicle i, the process noise variances are set to φ_i^{d,v} = 0.5 m²/s² and φ_i^{d,θ} = 0.01 rad², and the GPS measurement noise variances of each vehicle i, together with the sensing noise variances of vehicle i when observing any neighbor vehicle j, are fixed accordingly; i, j ∈ {1, 2, …, 9000} is the vehicle ID in the simulation.
Fig. 4 shows the trajectory estimation results for one vehicle after filtering with the proposed algorithm. As Fig. 4 shows, the estimates fluctuate around the actual trajectory, but they are significantly closer to the actual motion trajectory than the raw sensor measurements.
For comparison, Fig. 5 shows, for the vehicle of Fig. 4, how the error between the actual vehicle position and the positions given by sensing measurement and by the particle filter fusion algorithm varies over time. Since the position error is computed as the distance from the actual vehicle position, the error values in Fig. 5 are non-negative. Fig. 5 also shows that the position error of the sensing measurement is significantly larger than that of the algorithm's estimate; statistically, the proposed algorithm reduces the position estimation error for this vehicle by 60%.
Fig. 6 plots the mean square positioning error of the vehicle of Fig. 4 over the whole observation as a function of sensor noise: the abscissa is the variance of the position noise measured by the sensor, and the ordinate is the mean square error of the vehicle position over the whole observation. To characterize the effect of changing sensor noise, vehicle positioning was tested repeatedly under different noise characteristics; the four curves in Fig. 6 show the sensor measurement results and the algorithm estimation results for GPS position measurement noise variances of 0.5 m² and 1 m², respectively. As Fig. 6 shows, when the sensor position measurement noise variance is small, the sensor measurements and the algorithm estimates differ little; as that variance grows, however, the mean square error of the sensor measurements increases markedly while that of the algorithm estimates grows relatively slowly, so the gap between the two widens steadily. At the largest tested sensor position noise variance, the sensor measurement error reaches about 8 times the error of the algorithm estimate. Comparing results under different GPS position measurement noise variances also shows that GPS noise has some influence on the algorithm: the smaller the GPS position measurement noise variance, the smaller the mean square error of the algorithm estimate, i.e., the more accurate the positioning.
The algorithm descriptions above are presented to help those of ordinary skill in the art understand and practice the present invention. Those skilled in the art can readily make various modifications to the above algorithms and apply the general principles described here to other embodiments without inventive effort. The present invention is therefore not limited to the algorithms described above, and improvements and modifications made by those skilled in the art based on this disclosure fall within the protection scope of the present invention.

Claims (1)

1. A cooperative neighbor vehicle positioning method based on communication sensing asynchronous data fusion comprises the following steps:
(1) periodically measuring motion state information, including position and speed, of the own vehicle and of the neighbor vehicles, broadcasting the own state information to all neighbor vehicles through V2V, and receiving the state measurement information broadcast by the neighbor vehicles;
if the own vehicle is denoted i and any neighbor vehicle is denoted j, the motion states of vehicles i and j at time t are expressed as:
S_i(t) = [x_i(t), y_i(t), v_i(t), θ_i(t)]^T
S_j(t) = [x_j(t), y_j(t), v_j(t), θ_j(t)]^T
wherein: S_i(t) and S_j(t) are respectively the motion states of vehicles i and j at time t, x_i(t) and x_j(t) are respectively the lateral positions of vehicles i and j at time t, y_i(t) and y_j(t) are respectively the longitudinal positions of vehicles i and j at time t, v_i(t) and v_j(t) are respectively the speeds of vehicles i and j at time t, θ_i(t) and θ_j(t) are respectively the angles between the driving directions of vehicles i and j at time t and the due-east direction, T denotes transposition, and t is a non-negative real number;
if the k-th state measurement of neighbor vehicle j obtained by the own vehicle i is expressed as z_{j,k}^{r_k}, with k a natural number and r_k ∈ {s, g}: when r_k = s, the information is measured by a sensor of vehicle i and
z_{j,k}^{s} = S_{j,k} + n_{i,k}^{g} + n_{i,j,k}^{s}
where S_{j,k} = S_j(t_k) is the motion state of vehicle j at the corresponding time t_k, n_{i,k}^{g} is the state measurement error of vehicle i itself, and n_{i,j,k}^{s} is the sensing measurement error of vehicle i with respect to vehicle j; when r_k = g, the information is obtained by vehicle i through communication reception and
z_{j,k}^{g} = S_{j,k} + n_{j,k}^{g}
where n_{j,k}^{g} is the state measurement error of vehicle j itself;
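To make the two measurement channels concrete, the following Python sketch generates both kinds of measurement of S_j. The names are our own and the noise is assumed to be independent Gaussian per state component; this is an illustration, not the patent's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sense_measurement(s_j, var_gps_i, var_sense_ij):
    """r_k = s: z = S_j + n_i^g + n_{i,j}^s (vehicle i's own GPS error plus its sensing error)."""
    n_i = rng.normal(0.0, np.sqrt(var_gps_i))
    n_ij = rng.normal(0.0, np.sqrt(var_sense_ij))
    return s_j + n_i + n_ij

def comm_measurement(s_j, var_gps_j):
    """r_k = g: z = S_j + n_j^g (vehicle j broadcasts its own GPS-measured state)."""
    return s_j + rng.normal(0.0, np.sqrt(var_gps_j))
```

Both channels observe the same state S_j but with different error sources, which is why the filter later uses a different likelihood covariance for each.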
(2) extracting the vehicle ID, information acquisition time, position and speed from the obtained neighbor vehicle information, and judging from the neighbor vehicle ID and the information acquisition time whether a new filtering process needs to be established for that neighbor vehicle; for the k-th state measurement z_{j,k}^{r_k} of neighbor vehicle j obtained at any time t_k, judge from the ID of neighbor vehicle j whether it is a new neighbor vehicle: if so, establish a new filtering process for it; otherwise a filtering process for this neighbor vehicle already exists in the own vehicle i, and the timeliness of z_{j,k}^{r_k} is further judged from the acquisition time t_{k-1} of the previous state measurement obtained by the own vehicle i: if t_k − t_{k-1} > aT, the original filtering process is deleted and a new one is built for the neighbor vehicle; otherwise processing continues with the original filtering process, where T is the sensor measurement period and a is a preset timeliness parameter and a positive integer;
(3) positioning estimation is carried out on the neighbor vehicle by adopting a communication sensing asynchronous data fusion algorithm, namely, the particle state at the current moment is predicted according to the particle state at the last moment in the filtering process, then the obtained neighbor vehicle state measurement information is utilized to carry out particle weight updating, and finally the estimation of the motion state of the neighbor vehicle is completed and the particles are resampled;
in the case where a new filtering process must be built in vehicle i for neighbor vehicle j, i.e. k = 0, the communication sensing asynchronous data fusion algorithm proceeds as follows: when r_k = s, z_{j,0}^{s} = S_{j,0} + n_{i,0}^{g} + n_{i,j,0}^{s}, and S_{j,0} satisfies a Gaussian distribution with mean z_{j,0}^{s} and covariance matrix Σ_s; when r_k = g, z_{j,0}^{g} = S_{j,0} + n_{j,0}^{g}, and S_{j,0} satisfies a Gaussian distribution with mean z_{j,0}^{g} and covariance matrix Σ_g; then, the distribution of S_{j,0} being known, each particle state S_{j,0}^{m} is randomly sampled from that distribution and each particle weight w_0^m is set to 1/M, m = 1, 2, …, M, where M is the number of particles;
the covariance matrices Σ_s and Σ_g are respectively set as:
Σ_s = diag(φ_i^{g,x} + φ_{i,j}^{s,x}, φ_i^{g,y} + φ_{i,j}^{s,y}, φ_i^{g,v} + φ_{i,j}^{s,v}, φ_i^{g,θ} + φ_{i,j}^{s,θ})
Σ_g = diag(φ_j^{g,x}, φ_j^{g,y}, φ_j^{g,v}, φ_j^{g,θ})
wherein: φ_i^{g,x}, φ_i^{g,y}, φ_i^{g,v} and φ_i^{g,θ} are respectively the noise variances of vehicle i's GPS measurement of its own lateral position, longitudinal position, speed and angle; φ_j^{g,x}, φ_j^{g,y}, φ_j^{g,v} and φ_j^{g,θ} are respectively the noise variances of vehicle j's GPS measurement of its own lateral position, longitudinal position, speed and angle; and φ_{i,j}^{s,x}, φ_{i,j}^{s,y}, φ_{i,j}^{s,v} and φ_{i,j}^{s,θ} are respectively the noise variances of vehicle i's sensor measurement of the lateral relative position, longitudinal relative position, relative speed and relative angle of vehicle j;
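Under this diagonal-covariance initialization, sampling the initial particle cloud (k = 0) can be sketched as follows. The function name and the 4-vector variance layout are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def init_particles(z0, var_meas, num_particles=100, seed=0):
    """Draw M particle states from N(z0, diag(var_meas)); all weights start at 1/M.

    var_meas plays the role of the diagonal of Sigma_s (when r_0 = s)
    or of Sigma_g (when r_0 = g).
    """
    rng = np.random.default_rng(seed)
    std = np.sqrt(np.asarray(var_meas, dtype=float))
    # broadcast: each of the M rows is an independent 4-dimensional draw
    states = rng.normal(loc=z0, scale=std, size=(num_particles, len(z0)))
    weights = np.full(num_particles, 1.0 / num_particles)
    return states, weights
```

The first measurement thus seeds the filter directly, which is what lets a new neighbor be tracked without any prior history.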
in the case where a filtering process for neighbor vehicle j already exists in vehicle i, i.e. k ≠ 0, the communication sensing asynchronous data fusion algorithm predicts the particle states at the current time from the particle states at the previous time as follows: if r_{k-1} = r_k, i.e. the state measurements of neighbor vehicle j at times t_{k-1} and t_k were obtained by vehicle i in the same way, then t_k − t_{k-1} = bT with b a positive integer smaller than a; for each particle, the motion speed noise n_j^{d,v}(t_{k-1}) and motion angle noise n_j^{d,θ}(t_{k-1}) of vehicle j at time t_{k-1} are obtained by random sampling from the noise distribution, and then the motion state S_j(t_{k-1} + T) of vehicle j at time t_{k-1} + T is predicted by combining the particle state S_{j,k-1}^{m} at time t_{k-1} with the discrete nonlinear dynamic model, proceeding step by step until the particle state S_{j,k}^{m} at time t_k is obtained, m = 1, 2, …, M, where M is the number of particles;
if r_{k-1} ≠ r_k, i.e. the state measurements of neighbor vehicle j at times t_{k-1} and t_k were obtained by vehicle i in different ways, then t_k − t_{k-1} = bT + τ, where τ is the information acquisition time difference between vehicles i and j and τ < T; for each particle, the motion states S_j(t_{k-1} + bT) and S_j(t_{k-1} + (b+1)T) of vehicle j at times t_{k-1} + bT and t_{k-1} + (b+1)T are predicted step by step from the particle state S_{j,k-1}^{m} at time t_{k-1} and the discrete nonlinear dynamic model, and the particle state S_{j,k}^{m} at time t_k is then obtained by the following smoothing algorithm:
S_{j,k}^{m} = (1 − α) S_j(t_{k-1} + bT) + α S_j(t_{k-1} + (b+1)T)
wherein: α = τ/T; the discrete nonlinear dynamic model is expressed as follows:
x_j(t+T) = x_j(t) + v_j(t) T cos θ_j(t)
y_j(t+T) = y_j(t) + v_j(t) T sin θ_j(t)
v_j(t+T) = v_j(t) + n_j^{d,v}(t)
θ_j(t+T) = θ_j(t) + n_j^{d,θ}(t)
wherein: x_j(t+T) is the lateral position of vehicle j at time t+T, y_j(t+T) is the longitudinal position of vehicle j at time t+T, v_j(t+T) is the speed of vehicle j at time t+T, θ_j(t+T) is the angle between the driving direction of vehicle j and the due-east direction at time t+T, and n_j^{d,v}(t) and n_j^{d,θ}(t) are respectively the motion speed noise and motion angle noise of vehicle j at time t;
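The prediction step defined by this model, including the asynchronous-case smoothing with α = τ/T, can be sketched in Python. This is an illustrative sketch under the model equations above; the function names are ours:

```python
import numpy as np

def predict_step(state, T, var_v, var_theta, rng):
    """One period of the discrete nonlinear model; state = [x, y, v, theta]."""
    x, y, v, theta = state
    return np.array([
        x + v * T * np.cos(theta),                     # lateral position
        y + v * T * np.sin(theta),                     # longitudinal position
        v + rng.normal(0.0, np.sqrt(var_v)),           # speed random walk: n^{d,v}
        theta + rng.normal(0.0, np.sqrt(var_theta)),   # heading random walk: n^{d,theta}
    ])

def predict_async(state, T, tau, b, var_v, var_theta, rng):
    """Asynchronous case t_k - t_{k-1} = b*T + tau: predict b and b+1 steps,
    then blend the two states with alpha = tau / T."""
    s = state
    for _ in range(b):
        s = predict_step(s, T, var_v, var_theta, rng)
    s_next = predict_step(s, T, var_v, var_theta, rng)
    alpha = tau / T
    return (1.0 - alpha) * s + alpha * s_next
```

The blend interpolates between the two reachable grid times, which is how the filter handles measurements that arrive off the sensor period.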
the particle weights are updated with the obtained neighbor vehicle state measurement as follows: for each particle, the weight is updated by the following formula:
w_k^m = w_{k-1}^m · p(z_{j,k}^{r_k} | S_{j,k}^{m})
wherein: w_{k-1}^m and w_k^m are respectively the weights of particle m at times t_{k-1} and t_k; when r_k = s, p(z_{j,k}^{s} | S_{j,k}^{m}) is the value at z_{j,k}^{s} of the probability density function of a Gaussian distribution with mean S_{j,k}^{m} and covariance matrix Σ_s; when r_k = g, p(z_{j,k}^{g} | S_{j,k}^{m}) is the value at z_{j,k}^{g} of the probability density function of a Gaussian distribution with mean S_{j,k}^{m} and covariance matrix Σ_g;
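A minimal Python sketch of this likelihood-based weight update and of the subsequent weighted-mean state estimate, assuming the diagonal covariance described above (names are ours; the Gaussian normalization constant is dropped because the weights are renormalized anyway):

```python
import numpy as np

def update_weights(states, weights, z, var_meas):
    """w_k^m proportional to w_{k-1}^m * p(z | S^m), Gaussian with diagonal covariance."""
    var = np.asarray(var_meas, dtype=float)
    diff = z - states                              # (M, 4) residuals
    log_lik = -0.5 * np.sum(diff ** 2 / var, axis=1)
    w = weights * np.exp(log_lik - log_lik.max())  # subtract max for numerical stability
    return w / w.sum()                             # normalized weights

def estimate_state(states, weights):
    """S_hat_{j,k} = sum_m w^m * S^m with normalized weights."""
    return weights @ states
```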
the motion state estimate of neighbor vehicle j is computed by the following formulas:
ŵ_k^m = w_k^m / Σ_{n=1}^{M} w_k^n
Ŝ_{j,k} = Σ_{m=1}^{M} ŵ_k^m S_{j,k}^{m}
wherein: Ŝ_{j,k} is the motion state estimate of neighbor vehicle j at time t_k;
the particles are resampled as follows: when the effective particle number N_eff is less than a preset threshold N_th, particle resampling is performed, with N_eff = 1 / Σ_{m=1}^{M} (ŵ_k^m)^2; first, the particle-weight value interval (0, 1] is divided into M subintervals, each subinterval being (λ_{m-1}, λ_m] with λ_0 = 0 and λ_m = Σ_{n=1}^{m} ŵ_k^n; then M values u_l, l = 1, 2, …, M, uniformly distributed on [0, 1], are generated at random; when u_l falls in the interval (λ_{m-1}, λ_m], the particle state S_{j,k}^{m} is taken as the corresponding new particle state S_{j,k}^{l}, and the new particle weight w_k^l is set to 1/M.
CN202010372427.XA 2020-05-06 2020-05-06 Cooperative neighbor vehicle positioning method based on communication sensing asynchronous data fusion Active CN111586632B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010372427.XA CN111586632B (en) 2020-05-06 2020-05-06 Cooperative neighbor vehicle positioning method based on communication sensing asynchronous data fusion

Publications (2)

Publication Number Publication Date
CN111586632A CN111586632A (en) 2020-08-25
CN111586632B true CN111586632B (en) 2021-09-07

Family

ID=72111962

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010372427.XA Active CN111586632B (en) 2020-05-06 2020-05-06 Cooperative neighbor vehicle positioning method based on communication sensing asynchronous data fusion

Country Status (1)

Country Link
CN (1) CN111586632B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112880699B (en) * 2021-01-19 2023-03-10 中国人民解放军空军工程大学 Vehicle cooperative positioning method based on brain selective attention mechanism
CN115061176B (en) * 2022-08-05 2022-12-06 合肥工业大学 Vehicle GPS enhanced positioning method based on V2V instantaneous data exchange

Citations (3)

Publication number Priority date Publication date Assignee Title
CN104200695A (en) * 2014-08-15 2014-12-10 北京航空航天大学 Vehicle co-location method based on special short range communication for vehicular access
CN107315413A (en) * 2017-07-12 2017-11-03 北京航空航天大学 Under a kind of truck traffic environment consider vehicle between relative position many car co-located algorithms
CN110631593A (en) * 2019-11-25 2019-12-31 奥特酷智能科技(南京)有限公司 Multi-sensor fusion positioning method for automatic driving scene

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
EP2555017B1 (en) * 2011-08-03 2017-10-04 Harman Becker Automotive Systems GmbH Vehicle navigation on the basis of satellite positioning data and vehicle sensor data


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CNVPS: Cooperative Neighboring Vehicle Positioning System Based on Vehicle-to-Vehicle Communication;SANGHYUCK NAM等;《IEEE ACESS》;20190124;全文 *
Improving GPS-Based Vehicle Positioning for Intelligent Transportation Systems;Arghavan Amini等;《2014 IEEE Intelligent Vehicles Symposium (IV)》;20140611;全文 *
Asynchronous track fusion with unknown multi-sensor noise variances; Zhao Wei et al.; Chinese Journal of Sensors and Actuators; 20081231; Vol. 21, No. 12; full text *
Multi-vehicle cooperative positioning algorithm considering positioning information uncertainty; Lu Guangquan et al.; Journal of Transport Information and Safety; 20180531; Vol. 36, No. 5; full text *


Similar Documents

Publication Publication Date Title
Liu et al. Deeplora: Learning accurate path loss model for long distance links in lpwan
CN107516417B (en) A kind of real-time highway flow estimation method for excavating spatial and temporal association
CN111586632B (en) Cooperative neighbor vehicle positioning method based on communication sensing asynchronous data fusion
CN109275121B (en) Vehicle trajectory tracking method based on adaptive extended Kalman filtering
CN108171993B (en) Highway vehicle speed calculation method based on mobile phone signaling big data
CN110446160B (en) Deep learning method for vehicle position estimation based on multipath channel state information
CN104581943B (en) Node positioning method for Distributed Wireless Sensor Networks
CN112711055B (en) Indoor and outdoor seamless positioning system and method based on edge calculation
CN109932758B (en) Advection fog forecasting system and forecasting method
CN110956146B (en) Road background modeling method and device, electronic equipment and storage medium
EP4081835B1 (en) Methods, apparatuses, systems and computer program products for estimating road surface temperature
CN104936147A (en) Positioning method based on building layout constraint under complex indoor environment
CN109190811A (en) A kind of car speed tracking based on adaptive extended kalman filtering
CN112530177B (en) Kalman filtering-based vehicle queuing length estimation method in Internet of vehicles environment
CN105759274A (en) Typhoon attention area radar rainfall estimation method
CN112543471A (en) Complex environment-oriented mobile 5G hybrid access link interruption prediction method
Akhtar et al. Analysis of distributed algorithms for density estimation in vanets (poster)
CN103605960A (en) Traffic state identification method based on fusion of video images with different focal lengths
CN110087280B (en) Vehicle density estimation method based on beacon message
CN117029840A (en) Mobile vehicle positioning method and system
CN115100847B (en) Queuing service time estimation method for low-permeability network-connected track data
Raiyn Classification of road traffic anomaly based on travel data analysis
CN114916059B (en) WiFi fingerprint sparse map extension method based on interval random logarithmic shadow model
CN111199646B (en) Urban signal control main road vehicle track reconstruction method based on sparse detection data
CN107590509B (en) Cherenov fusion method based on maximum expectation approximation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant