CN113837237B - Multi-sensor fusion target identification method based on evidence confidence entropy and similarity - Google Patents

Multi-sensor fusion target identification method based on evidence confidence entropy and similarity

Info

Publication number
CN113837237B
CN113837237B CN202111022948.3A
Authority
CN
China
Prior art keywords
target
current target
similarity
evidences
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111022948.3A
Other languages
Chinese (zh)
Other versions
CN113837237A (en)
Inventor
李枭扬
杨振
袁展翅
周德云
闵令通
侍佼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202111022948.3A priority Critical patent/CN113837237B/en
Publication of CN113837237A publication Critical patent/CN113837237A/en
Application granted granted Critical
Publication of CN113837237B publication Critical patent/CN113837237B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Abstract

The invention discloses a multi-sensor fusion target identification method based on evidence confidence entropy and similarity, which comprises the following steps: acquiring a plurality of preset types to which targets may belong; each sensor determines a first probability that the current target belongs to each preset type, yielding a plurality of target identification evidences for the current target; for each sensor, the similarity and uncertainty of the target identification evidences of the current target are calculated, and an uncertainty weight and a similarity weight are determined; the first probabilities are weighted according to the uncertainty weight and the similarity weight to obtain second probabilities that the current target belongs to each preset type; after the second probabilities corresponding to every sensor are obtained, K rounds of fusion are performed over the N sensors, with (k+1) sensors selected according to a preset sequence at the kth round, and the preset type of the current target is determined from the fusion results. For the problem of target identification with multi-sensor, multi-source information fusion, the method effectively improves the accuracy of target identification.

Description

Multi-sensor fusion target identification method based on evidence confidence entropy and similarity
Technical Field
The invention belongs to the technical field of information fusion, and particularly relates to a multi-sensor fusion target identification method based on evidence confidence entropy and similarity.
Background
Aerial target identification is the process of judging the specific identity or type of an adversary target, but various factors cause different sensor platforms to produce inconsistent target-type identification results. In recent years, data fusion algorithms have been used to improve the accuracy of target recognition results. However, because the identity-type recognition results obtained by different sensor platforms differ, directly fusing target recognition evidences that suffer from this non-uniformity can yield fusion results contrary to the real situation. Research on target recognition methods under non-uniform evidences is therefore of great significance for improving the accuracy of target-type recognition.
Currently, D-S evidence theory (Dempster-Shafer Evidence Theory, DSET) is widely used for processing uncertain information, as it provides a way to represent uncertainty data. The literature (C. E. Shannon. A Mathematical Theory of Communication. ACM SIGMOBILE Mobile Computing and Communications Review, 2001, 5(1): 3-55.) describes Shannon entropy as a common means of measuring the statistical disorder of data; however, under the DSET framework focal elements are not restricted to single elements, so Shannon entropy is not suitable for use under the DSET framework.
To address the representation of information confusion under the DSET framework, the literature (G. J. Klir, M. J. Wierman. Uncertainty-based Information: Elements of Generalized Information Theory [M]. Physica, 2013.) proposes five properties that a confidence-entropy measure of evidence uncertainty under the DSET framework should satisfy; using this standard, evidence uncertainty measures under the DSET framework can be defined more rigorously. The literature (L. Pan, Y. Deng. A New Belief Entropy to Measure Uncertainty of Basic Probability Assignments Based on Belief Function and Plausibility Function [J]. Entropy, 2018, 20(11): 842.) proposes a new uncertainty measure for basic probability assignments based on Deng entropy, which introduces intervals into the problem. The literature (R. Jiroušek, P. P. Shenoy. A New Definition of Entropy of Belief Functions in the Dempster-Shafer Theory [J]. International Journal of Approximate Reasoning, 2018, 92: 49-65.) analyzes evidence composition under the DSET framework against the five previously proposed properties for measuring confidence entropy, expands those properties, and provides a method for calculating evidence confusion.
However, in the related art, while D-S evidence theory has certain advantages in solving partial-uncertainty problems, when the evidences are inconsistent the result obtained by its fusion rule often contradicts the correct result. Shannon entropy is not applicable under the DSET framework because focal elements there are not restricted to single elements. Moreover, most confidence entropy models concentrate only on the value of each basic probability or its cardinality, completely ignore the influence of the recognition frame, and therefore cannot measure the degree of uncertainty under different recognition frames.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides a multi-sensor fusion target identification method based on evidence confidence entropy and similarity. The technical problems to be solved by the invention are realized by the following technical scheme:
the invention provides a multi-sensor fusion target identification method based on evidence confidence entropy and similarity, which comprises the following steps:
determining a plurality of targets to be identified, and acquiring a plurality of preset types to which the targets belong;
each sensor determines a first probability that a current target belongs to each preset type, and a plurality of target identification evidences of the current target are obtained;
for each sensor, calculating the similarity and uncertainty of a plurality of target recognition evidences of the current target, and determining the uncertainty weight and similarity weight of each target recognition evidence;
weighting the first probability according to the uncertainty weight and the similarity weight to obtain second probabilities that the current target belongs to each preset type;
after obtaining the second probability that the current target corresponding to each sensor belongs to each preset type, K rounds of fusion are carried out over the N sensors, with (k+1) sensors selected according to a preset sequence at the kth round, and the preset type to which the current target belongs is determined according to the K fusion results obtained, wherein K=N-1.
In one embodiment of the present invention, the step of calculating, for each sensor, the similarity and uncertainty of a plurality of target recognition evidences of the current target, and determining an uncertainty weight and similarity weight of each of the target recognition evidences, includes:
determining uncertainty of a plurality of target recognition evidences of the current target by using a preset confidence entropy model;
determining conflict degrees of a plurality of target recognition evidences of the current target, and determining similarity among the plurality of target recognition evidences of the current target according to the conflict degrees;
and determining the uncertainty weight and the similarity weight of each target recognition evidence according to the uncertainty and the similarity of a plurality of target recognition evidences of the current target.
In one embodiment of the present invention, the step of determining a degree of conflict of the multiple target recognition evidences of the current target and determining a similarity between the multiple target recognition evidences of the current target according to the degree of conflict includes:
determining the conflict degree among a plurality of target recognition evidences of the current target by utilizing a Minkowski distance to obtain a conflict degree coefficient among the target recognition evidences;
according to the conflict degree coefficient between the target recognition evidences, calculating the similarity between the multiple target recognition evidences of the current target according to the following formula:
sim_{i,j} = 1 - MDismP(m_i, m_j)
wherein MDismP(m_i, m_j) represents the conflict-degree coefficient between the ith and jth target recognition evidences corresponding to the current target, and sim_{i,j} represents the similarity between the ith and jth target recognition evidences corresponding to the current target.
In one embodiment of the present invention, each sensor determines a first probability that a current target belongs to each preset type, and the step of obtaining a plurality of target recognition evidences of the current target includes:
and each sensor determines the first probability that the current target belongs to each preset type by using a mass function.
In one embodiment of the invention, the degree of conflict coefficient between the multiple target recognition evidences of the current target is determined according to the following formula:
wherein A represents the set of all preset types, A_t represents the t-th preset type, |A| represents the number of preset types, m(·) represents the mass function, Θ represents the preset recognition frame, P(Θ) represents the power set mapped to [0, 1] via the mass function, and K(m_i, m_j) represents the conflict coefficient in the Dempster combination rule.
In one embodiment of the present invention, the preset confidence entropy model is:
wherein A represents one of the plurality of preset types to which the target belongs, the summation term represents the sum of all likelihood functions containing X in A, and |X| represents the cardinality of the preset target recognition frame.
In one embodiment of the present invention, the step of weighting the first probability according to the uncertainty weight and the similarity weight to obtain a second probability that the current target belongs to each preset type includes:
and each sensor determines a first weight of each target identification evidence of the current target according to the uncertainty weight and the similarity weight, and multiplies the first weight with a first probability that the current target belongs to each preset type to obtain a second probability that the current target belongs to each preset type.
In one embodiment of the present invention, after obtaining a second probability that a current target corresponding to each sensor belongs to each preset type, performing K-time fusion on N sensors, selecting (k+1) sensors according to a preset sequence during the K-time fusion, and determining, according to the obtained K fusion results, the preset type to which the current target belongs, where the step includes:
acquiring a second probability that a current target corresponding to each sensor belongs to each preset type;
at the kth fusion, selecting the 1st, 2nd, …, (k+1)th sensors from the N sensors;
acquiring a plurality of second probabilities that the current target belongs to the same preset type in the 1st, 2nd, …, (k+1)th sensors, and fusing the plurality of second probabilities to obtain a fusion result;
after K times of fusion, K fusion results are obtained, and the preset category of the current target is determined according to non-zero values in the K fusion results, wherein K=N-1.
In one embodiment of the invention, the plurality of second probabilities are fused according to the following formula:
wherein m_1, m_2, …, m_n represent the mass functions respectively corresponding to the selected n sensors, and m_1(A_1)·m_2(A_2)…m_n(A_n) represents the product of the first probabilities that each sensor determines that the current target belongs to a certain preset type; wherein 1 < n ≤ N.
compared with the prior art, the invention has the beneficial effects that:
the invention provides a multi-sensor fusion target recognition method based on evidence confidence entropy and similarity, which comprises the steps that after a plurality of target recognition evidences of a current target are obtained by each sensor, similarity and uncertainty of the plurality of target recognition evidences of the current target are calculated, uncertainty weight and similarity weight of each target recognition evidence are determined, the first probability is weighted according to the uncertainty weight and the similarity weight, the second probability that the current target belongs to each preset type is obtained, and then K times of fusion is carried out on N sensors after the second probability that the current target corresponding to each sensor belongs to each preset type is obtained, so that a target recognition result is determined.
The present invention will be described in further detail with reference to the accompanying drawings and examples.
Drawings
FIG. 1 is a schematic flow chart of a multi-sensor fusion target recognition method based on evidence confidence entropy and similarity provided by an embodiment of the invention;
FIG. 2 is another flow chart of a multi-sensor fusion target recognition method based on evidence confidence entropy and similarity provided by an embodiment of the present invention;
fig. 3 is another flow chart of a multi-sensor fusion target recognition method based on evidence confidence entropy and similarity according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to specific examples, but embodiments of the present invention are not limited thereto.
Fig. 1 is a schematic flow chart of a multi-sensor fusion target recognition method based on evidence confidence entropy and similarity according to an embodiment of the present invention. Referring to fig. 1, an embodiment of the present invention provides a multi-sensor fusion target recognition method based on evidence confidence entropy and similarity, including:
s1, determining a plurality of targets to be identified, and acquiring a plurality of preset types to which the targets belong;
s2, each sensor determines a first probability that a current target belongs to each preset type, and a plurality of target identification evidences of the current target are obtained;
s3, aiming at each sensor, calculating the similarity and uncertainty of a plurality of target recognition evidences of the current target, and determining the uncertainty weight and similarity weight of each target recognition evidence;
s4, weighting the first probability according to the uncertainty weight and the similarity weight to obtain second probabilities that the current target belongs to each preset type;
s5, after obtaining the second probability that the current target corresponding to each sensor belongs to each preset type, K times of fusion are carried out on the N sensors, and k+1 sensors are selected according to a preset sequence during the K times of fusion, and the preset type of the current target is determined according to the obtained K fusion results, wherein K=N-1.
Specifically, the object recognition method provided in this embodiment is based on the DSET framework. First, a plurality of preset types for the targets to be identified are acquired; then each sensor determines the first probability that the current target belongs to each preset type, which serves as the plurality of target identification evidences of the current target. For example, if there are 4 preset types, each sensor respectively determines the first probability that the current target belongs to each of the four preset types; each sensor can determine these first probabilities by using a mass function.
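As a concrete illustration of this step, the sketch below builds a mass function (basic probability assignment) over a three-type recognition frame and checks its defining properties; the frame and mass values are invented for illustration and are not the patent's data.

```python
# Hypothetical recognition frame Θ = {a, b, c} and one sensor's mass
# function. Keys are focal elements (subsets of Θ); values are masses.
frame = frozenset({"a", "b", "c"})

bpa_sensor1 = {
    frozenset({"a"}): 0.6,       # first probability for type a
    frozenset({"b"}): 0.1,
    frozenset({"a", "c"}): 0.2,  # mass on a compound proposition
    frame: 0.1,                  # mass on Θ models total ignorance
}

def is_valid_bpa(m, frame):
    """A mass function assigns nothing to the empty set, stays within
    the frame, and its masses sum to 1."""
    return (all(len(A) > 0 and A <= frame for A in m)
            and abs(sum(m.values()) - 1.0) < 1e-9)

print(is_valid_bpa(bpa_sensor1, frame))  # True
```

Unlike an ordinary probability distribution, mass may sit on compound propositions such as {a, c}, which is exactly why Shannon entropy alone cannot measure the uncertainty of such evidence.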
Optionally, in the step S3, for each sensor, a step of calculating similarity and uncertainty of a plurality of target recognition evidences of the current target, and determining uncertainty weight and similarity weight of each target recognition evidence includes:
s301, determining uncertainty of a plurality of target recognition evidences of a current target by using a preset confidence entropy model;
s302, determining conflict degrees of a plurality of target recognition evidences of the current target, and determining similarity among the plurality of target recognition evidences of the current target according to the conflict degrees;
s303, determining uncertainty weight and similarity weight of each target recognition evidence according to the uncertainty and similarity of a plurality of target recognition evidences of the current target.
In this embodiment, the uncertainty of the multiple target recognition evidences of the current target is determined using the confidence entropy model. It should be understood that the greater the confusion of a target recognition evidence, the greater its confidence entropy and thus its uncertainty, and therefore the smaller the weight that should be given to that evidence.
Specifically, the preset confidence entropy model may be:
wherein A represents one of a plurality of preset types to which the target belongs,representing the sum of all likelihood functions containing X in A, |X| represents the cardinality of a preset target recognition frame.
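The patent's own confidence entropy model H_JX is given only as a figure and is not reproduced here. As a hedged stand-in, the sketch below computes Deng entropy, one of the related confidence entropies discussed in the Background, which likewise grows with both the mass values and the cardinality of the focal elements.

```python
import math

def deng_entropy(m):
    """Deng entropy E_d = -Σ_A m(A) · log2( m(A) / (2^|A| - 1) ).
    Shown only as a representative confidence-entropy measure; the
    invention's H_JX additionally accounts for the recognition-frame
    cardinality and differs from this formula."""
    return -sum(v * math.log2(v / (2 ** len(A) - 1))
                for A, v in m.items() if v > 0)

# For singleton-only evidence, Deng entropy reduces to Shannon entropy;
# mass on a larger focal element yields strictly more uncertainty.
m_certain = {frozenset({"a"}): 1.0}
m_split = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5}
m_vague = {frozenset({"a", "b"}): 1.0}
print(deng_entropy(m_certain), deng_entropy(m_split), deng_entropy(m_vague))
```

Note how m_vague, which commits everything to the two-element proposition {a, b}, scores higher than the evenly split singleton evidence: the measure is sensitive to focal-element cardinality, which is the property the Background argues Shannon entropy lacks.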
Further, in step S302, the step of determining the degree of collision of the multiple pieces of target recognition evidence of the current target, and determining the similarity between the multiple pieces of target recognition evidence of the current target according to the degree of collision includes:
determining the conflict degree among a plurality of target recognition evidences of the current target by utilizing the Minkowski distance to obtain a conflict degree coefficient among the target recognition evidences;
and calculating the similarity among the multiple target recognition evidences of the current target according to the following formula according to the conflict degree coefficient among the target recognition evidences.
Specifically, this embodiment solves the above conflict-degree coefficient and similarity by using the Hamacher T-conorm fusion rule:
wherein:
the degree of conflict coefficient between the multiple target recognition evidences of the current target is as follows:
wherein A represents the set of all preset types, A_t represents the t-th preset type, |A| represents the number of preset types, m(·) represents the mass function, Θ represents the preset recognition frame, P(Θ) represents the power set mapped to [0, 1] via the mass function, and K(m_i, m_j) represents the conflict coefficient in the Dempster combination rule.
It should be noted that, in this embodiment, the distance value between each target recognition evidence is determined by using the Minkowski distance to measure the degree of conflict between the target recognition evidences, where the degree of conflict is inversely proportional to the distance value, and the greater the coefficient of degree of conflict, the smaller the similarity between the two target recognition evidences.
Further, the similarity between the multiple target recognition evidences of the current target is:
sim_{i,j} = 1 - MDismP(m_i, m_j)
wherein MDismP(m_i, m_j) represents the conflict-degree coefficient between the ith and jth target recognition evidences corresponding to the current target, and sim_{i,j} represents the similarity between the ith and jth target recognition evidences corresponding to the current target.
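The exact MDismP formula also appears only in a figure. Assuming it behaves like a (normalised) Minkowski distance between the pignistic (betting-commitment) probabilities of two evidences — an assumption, not the patent's definition — the conflict degree and the similarity sim_{i,j} = 1 - MDismP(m_i, m_j) can be sketched as:

```python
def pignistic(m, frame):
    """BetP: spread each focal element's mass evenly over its singletons."""
    betp = dict.fromkeys(frame, 0.0)
    for A, mass in m.items():
        for x in A:
            betp[x] += mass / len(A)
    return betp

def mdismp(m_i, m_j, frame):
    """Assumed stand-in for MDismP: half the L1 (Minkowski, p = 1)
    distance between the two pignistic probabilities, which lies in
    [0, 1] so that the similarity below is also in [0, 1]."""
    bi, bj = pignistic(m_i, frame), pignistic(m_j, frame)
    return 0.5 * sum(abs(bi[x] - bj[x]) for x in frame)

def similarity(m_i, m_j, frame):
    return 1.0 - mdismp(m_i, m_j, frame)

# Hypothetical evidences from two sensors over Θ = {a, b, c}.
frame = frozenset({"a", "b", "c"})
m1 = {frozenset({"a"}): 0.8, frozenset({"b"}): 0.2}
m2 = {frozenset({"a"}): 0.7, frozenset({"b", "c"}): 0.3}
print(round(similarity(m1, m2, frame), 3))
```

As the text notes, a larger conflict-degree coefficient yields a smaller similarity, and an evidence compared with itself has similarity 1.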
A similarity matrix S_{N×N} is defined from the pairwise similarities, in which sim_{i,j} = sim_{j,i}, the diagonal entries equal 1, and N represents the number of sensors.
The support weight vector W_dis is solved from the eigenvalue equation:
λ_max · W_dis = S_{N×N} · W_dis
wherein λ_max is the maximum eigenvalue of S_{N×N}.
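For a nonnegative similarity matrix, the eigenvalue equation λ_max · W_dis = S_{N×N} · W_dis can be solved by simple power iteration; a minimal sketch, with a hypothetical 3-sensor similarity matrix, follows.

```python
def support_weights(S, iters=200):
    """Power iteration for λ_max·W = S·W on a nonnegative symmetric
    similarity matrix; the dominant eigenvector is normalised to sum
    to 1 so it can be used directly as a weight vector."""
    n = len(S)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(S[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [v / total for v in w]
    return w

# Hypothetical matrix: sensors 1 and 2 agree closely, sensor 3
# conflicts with both, so sensor 3 should receive the lowest weight.
S = [[1.0, 0.9, 0.2],
     [0.9, 1.0, 0.3],
     [0.2, 0.3, 1.0]]
print([round(v, 3) for v in support_weights(S)])
```

This matches the intent of the similarity weighting: an evidence supported by many similar evidences gets a large entry in the dominant eigenvector, while a conflicting evidence is suppressed.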
The similarity weight definition and uncertainty weight are respectively:
optionally, in the step S4, the step of weighting the first probability according to the uncertainty weight and the similarity weight to obtain the second probability that the current target belongs to each preset type includes:
and each sensor determines the first weight of each target identification evidence of the current target according to the uncertainty weight and the similarity weight, and multiplies the first weight with the first probability that the current target belongs to each preset type to obtain the second probability that the current target belongs to each preset type.
Specifically, after determining the uncertainty weight and the similarity weight of each target recognition evidence, determining the first weight of the target recognition evidence according to the following formula:
and multiplying the first weight corresponding to the target recognition evidence with the first probability that the current target belongs to each preset type respectively to obtain the second probability that the current target belongs to each preset type.
Optionally, in the step S5, after obtaining the second probability that the current target corresponding to each sensor belongs to each preset type, K times of fusion are performed on the N sensors, and (k+1) sensors are selected according to a preset sequence during the K times of fusion, and the preset type to which the current target belongs is determined according to the obtained K fusion results, where the step of k=n-1 includes:
s501, obtaining a second probability that a current target corresponding to each sensor belongs to each preset type;
s502, selecting 1 st, 2 nd and … … th (k+1) sensors from N sensors in the kth fusion;
s503, acquiring a plurality of second probabilities of the current target belonging to the same preset type in the 1 st, 2 nd, … … nd and (k+1) th sensors, and fusing the plurality of second probabilities to obtain a fusion result;
s504, after K times of fusion are carried out, K fusion results are obtained, and the preset category of the current target is determined according to non-zero values in the K fusion results, wherein K=N-1.
Specifically, this embodiment fuses the second probabilities based on the Dempster combination rule. Taking N sensors and the preset types S1, S2, S3 and S4 as an example, the 1st and 2nd sensors are selected in the first fusion. For each preset type in turn (S1, S2, S3, S4), the second probability that the current target belongs to that type in the 1st sensor is multiplied by the second probability that the current target belongs to the same type in the 2nd sensor, and the products are then fused according to the following formula:
wherein m_1, m_2, …, m_n represent the mass functions respectively corresponding to the selected n sensors, and m_1(A_1)·m_2(A_2)…m_n(A_n) represents the product of the first probabilities that each sensor determines that the current target belongs to a certain preset type; wherein 1 < n ≤ N.
it should be understood that the subsequent fusion process for the second probability is the same as the first fusion process, except that the selected sensors are different, specifically, the second selected sensor includes the 1 st, 2 nd, 3 rd sensors, the third selected sensor includes the 1 st, 2 nd, 3 rd, 4 th sensors … … th fusion selected sensor includes the 1 st, 2 nd, … … th, N th sensors, and the fusion process is not described herein.
The multi-sensor fusion target recognition method based on the evidence confidence entropy and the similarity is further described below through simulation experiments.
Assume that the recognition frame contains 15 preset types, Θ = {1, 2, …, 14, 15}, with m({3, 4, 5}) = 0.05, m({7}) = 0.05, m(A_i) = 0.8 and m(Θ) = 0.1; wherein A_i is a subset of 2^Θ whose number of identification types increases in turn from 1 to 14. The confidence entropy model provided by the invention is compared with existing confidence entropy models, and the simulation results are shown in Table 1:
TABLE 1
In Table 1, H_o is a previously proposed confidence entropy model applicable under the DSET framework, H_d is the Dubois and Prade confidence entropy, H_p is the Pal confidence entropy, H_pd is the Pan and Deng confidence entropy, H_JS is the confidence entropy of Jiroušek and Shenoy, and H_JX is the confidence entropy proposed by the invention.
As can be seen from the results in Table 1, when the number of identification types in proposition A_i increases, H_p and H_o show no obvious change, which indicates that H_p and H_o do not accurately measure the effect of the change in identification types. H_d, H_pd, H_JS and H_JX all increase with the number of identification types in A_i. Compared with H_JS, H_JX makes full use of the basic probability of each proposition, the basic probabilities and the number of identification types contained in the recognition frame, and can eliminate the influence of repeated data on the fusion result. Compared with H_d, H_JX can reflect the influence of changes in the recognition frame on the uncertainty of sensor evidence. In summary, H_JX is superior to the other confidence entropies.
Further, the embodiment performs simulation verification on the multi-sensor target recognition method based on the confidence entropy and the similarity. Assuming that the recognition frame consists of three classes of targets, Θ= { a, b, c }, the first probabilities determined by the five sensors are shown in table 2.
TABLE 2
Analysing the target recognition evidences of the current target obtained by the five sensors shows that sensors 1, 3, 4 and 5 assign the largest first probability to preset type a, while sensor 2 assigns the largest first probability to preset type b. Evidently the current target is most likely of preset type a, but sensor 2 conflicts with the data of the other sensors, so the sensor data need to be further fused to obtain the final recognition result.
Comparing the fusion result in the multi-sensor fusion target recognition method based on the evidence confidence entropy and the similarity with the fusion result based on the Yager method, the Murphy method and the Deng method, wherein the result is shown in Table 3:
TABLE 3 Table 3
Obviously, as can be seen from table 3, the multi-sensor fusion target identification method based on evidence confidence entropy and similarity is still stable and reliable under uncertain information, and can quickly identify the current target.
According to the above embodiments, the beneficial effects of the invention are as follows:
the invention provides a multi-sensor fusion target recognition method based on evidence confidence entropy and similarity, which comprises the steps that after a plurality of target recognition evidences of a current target are obtained by each sensor, similarity and uncertainty of the plurality of target recognition evidences of the current target are calculated, uncertainty weight and similarity weight of each target recognition evidence are determined, the first probability is weighted according to the uncertainty weight and the similarity weight, the second probability that the current target belongs to each preset type is obtained, and then K times of fusion is carried out on N sensors after the second probability that the current target corresponding to each sensor belongs to each preset type is obtained, so that a target recognition result is determined.
In the description of the present invention, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present invention, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the description of the present specification, a description referring to the terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Further, one skilled in the art can combine the different embodiments or examples described in this specification.
Although the present application has been described herein in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the figures, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
The foregoing is a further detailed description of the invention in connection with the preferred embodiments, and it is not intended that the invention be limited to the specific embodiments described. It will be apparent to those skilled in the art that several simple deductions or substitutions may be made without departing from the spirit of the invention, and these should be considered to be within the scope of the invention.

Claims (1)

1. A multi-sensor fusion target identification method based on evidence confidence entropy and similarity is characterized by comprising the following steps:
determining a plurality of targets to be identified, and acquiring a plurality of preset types to which the targets belong;
each sensor determines a first probability that a current target belongs to each preset type, and a plurality of target identification evidences of the current target are obtained;
for each sensor, calculating the similarity and uncertainty of a plurality of target recognition evidences of the current target, and determining the uncertainty weight and similarity weight of each target recognition evidence;
weighting the first probability according to the uncertainty weight and the similarity weight to obtain second probabilities that the current target belongs to each preset type;
after obtaining a second probability that a current target corresponding to each sensor belongs to each preset type, carrying out K fusions on the N sensors, selecting k+1 sensors according to a preset sequence at the k-th fusion, and determining the preset type of the current target according to the obtained K fusion results, wherein K = N-1;
for each sensor, calculating the similarity and uncertainty of a plurality of target recognition evidences of the current target, and determining an uncertainty weight and similarity weight of each target recognition evidence, wherein the step comprises the following steps:
determining uncertainty of a plurality of target recognition evidences of the current target by using a preset confidence entropy model;
determining conflict degrees of a plurality of target recognition evidences of the current target, and determining similarity among the plurality of target recognition evidences of the current target according to the conflict degrees;
determining uncertainty weight and similarity weight of each target recognition evidence according to the uncertainty and similarity of a plurality of target recognition evidences of the current target respectively;
the step of determining the degree of conflict of the multiple target recognition evidences of the current target and determining the similarity between the multiple target recognition evidences of the current target according to the degree of conflict comprises the following steps:
determining the conflict degree among a plurality of target recognition evidences of the current target by utilizing a Minkowski distance to obtain a conflict degree coefficient among the target recognition evidences;
according to the conflict degree coefficient between the target recognition evidences, calculating the similarity between the multiple target recognition evidences of the current target according to the following formula:
sim_{i,j} = 1 - MDismP(m_i, m_j)
wherein MDismP(m_i, m_j) represents the conflict degree coefficient between the i-th target recognition evidence m_i and the j-th target recognition evidence m_j corresponding to the current target, and sim_{i,j} represents the similarity between the i-th target recognition evidence m_i and the j-th target recognition evidence m_j;
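As an illustration of this sub-step, the following sketch models MDismP as a normalized Minkowski distance between mass vectors (an assumption: the claim names the Minkowski distance but does not reproduce the exact MDismP formula) and derives the similarity from it:

```python
import numpy as np

def conflict_coefficient(m_i, m_j, p=2):
    """Illustrative conflict-degree coefficient between two bodies of
    evidence, modelled as a normalized Minkowski p-distance between
    their mass vectors so that the coefficient lies in [0, 1]."""
    m_i, m_j = np.asarray(m_i, float), np.asarray(m_j, float)
    d = np.sum(np.abs(m_i - m_j) ** p) ** (1.0 / p)
    # The maximum Minkowski distance between two probability vectors
    # is 2**(1/p) (e.g. [1,0] vs [0,1]), so divide by it to normalize.
    return d / (2.0 ** (1.0 / p))

def similarity(m_i, m_j, p=2):
    # sim_{i,j} = 1 - MDismP(m_i, m_j), as in the claim.
    return 1.0 - conflict_coefficient(m_i, m_j, p)
```

Identical evidences get similarity 1, totally conflicting evidences get similarity 0, and partially agreeing evidences fall in between.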
each sensor determines a first probability that a current target belongs to each preset type, and a step of obtaining a plurality of target recognition evidences of the current target comprises the following steps:
the sensors determine first probabilities that the current target belongs to each preset type by using a mass function;
determining a conflict degree coefficient between a plurality of target recognition evidences of the current target according to the following formula:
wherein A represents all preset types, A_t represents the t-th preset type, |A| represents the number of preset types, M(·) represents the mass function, M(A_t) represents the probability, determined by using the mass function, that the current target belongs to the preset type A_t, Θ represents a preset recognition frame, P(Θ) represents the mapping to [0,1] via the mass function, and K(m_i, m_j) represents the conflict coefficient in the Dempster combination rule;
the preset confidence entropy model is as follows:
wherein A represents one of the plurality of preset types to which the target belongs, the summation term represents the sum of all likelihood functions containing X in A, and X represents the base number (cardinality) of the preset target identification frame;
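The entropy formula itself is not reproduced in the text above; a common model consistent with the described role (penalizing mass assigned to large, ambiguous focal sets) is Deng's belief entropy, sketched below as an assumption rather than the patent's exact confidence-entropy model:

```python
import math

def belief_entropy(masses):
    """Deng-style belief entropy for a body of evidence. `masses`
    maps each focal element (a frozenset of hypotheses) to its mass.
    E = -sum m(A) * log2( m(A) / (2**|A| - 1) ); the denominator
    grows with the focal set size, so vaguer evidence scores higher."""
    e = 0.0
    for focal, m in masses.items():
        if m > 0:
            e -= m * math.log2(m / (2 ** len(focal) - 1))
    return e
```

A fully certain singleton assignment yields entropy 0, while splitting mass between hypotheses, or assigning it to compound sets, increases the entropy.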
the step of weighting the first probability according to the uncertainty weight and the similarity weight to obtain second probabilities that the current target belongs to each preset type comprises the following steps:
each sensor determines a first weight of each target identification evidence of the current target according to the uncertainty weight and the similarity weight, and multiplies the first weight with a first probability that the current target belongs to each preset type to obtain a second probability that the current target belongs to each preset type;
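A minimal sketch of this weighting step, assuming the first weight is formed by multiplying the uncertainty weight and the similarity weight and renormalizing (the passage does not fix the exact combination rule):

```python
import numpy as np

def weighted_evidence(first_probs, unc_weights, sim_weights):
    """Combine each evidence's uncertainty weight and similarity
    weight into a first weight, then scale and sum the first
    probability vectors to obtain the second probabilities."""
    first_probs = np.asarray(first_probs, float)  # (n_evidence, n_types)
    w = np.asarray(unc_weights, float) * np.asarray(sim_weights, float)
    w = w / w.sum()                               # normalized first weights
    return w @ first_probs                        # second probabilities per type
```

Because the weights are normalized, the output remains a valid probability vector over the preset types.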
after obtaining the second probability that the current target corresponding to each sensor belongs to each preset type, carrying out K fusions on the N sensors, selecting k+1 sensors according to a preset sequence at the k-th fusion, and determining the preset type of the current target according to the obtained K fusion results, wherein the step comprises:
acquiring a second probability that a current target corresponding to each sensor belongs to each preset type;
at the k-th fusion, selecting the 1st, 2nd, …, (k+1)-th sensors from the N sensors;
acquiring a plurality of second probabilities that the current target belongs to the same preset type from the 1st, 2nd, …, (k+1)-th sensors, and fusing the plurality of second probabilities to obtain a fusion result;
after K times of fusion are carried out, K fusion results are obtained, and the preset type of the current target is determined according to non-zero values in the K fusion results, wherein K=N-1;
fusing the plurality of second probabilities according to the following formula:
wherein M (A) 1 )、M(A 2 )…M(A n ) Respectively, that the current targets determined by the mass function M (·) belong to the preset type A 1 、A 2 …A n Y represents the fusion result; wherein N is more than 1 and less than or equal to N,

Publications (2)

Publication Number Publication Date
CN113837237A (en) 2021-12-24
CN113837237B (en) 2024-02-20
