CN113740843B - Motion state estimation method and system for tracking target and electronic device - Google Patents


Info

Publication number
CN113740843B
CN113740843B (application CN202111044918.2A)
Authority
CN
China
Prior art keywords
tracking
photoelectric
target
angle measurement
radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111044918.2A
Other languages
Chinese (zh)
Other versions
CN113740843A (en)
Inventor
陈大鹏
王长城
樊鹏
李文才
康林
董琦昕
黄佳乐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China South Industries Group Automation Research Institute
Original Assignee
China South Industries Group Automation Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China South Industries Group Automation Research Institute
Priority to CN202111044918.2A
Publication of CN113740843A
Application granted
Publication of CN113740843B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/02: Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S 13/50: Systems of measurement based on relative movement of target
    • G01S 13/66: Radar-tracking systems; Analogous systems
    • G01S 13/72: Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a motion state estimation method, system, and electronic device for a tracked target, together with a computer-readable storage medium. The motion state estimation method comprises the following operations: computing real-time statistics of the laser echo rate obtained by the photoelectric tracking equipment; judging the laser echo rate: when the laser echo rate is greater than or equal to a set threshold, inputting only the data monitored by the photoelectric tracking equipment into the filtering algorithm as variables; when the laser echo rate is smaller than the set threshold, inputting the obtained photoelectric angle measurements and radar ranging data into the filtering algorithm as variables; and estimating the motion state of the tracked target with the filtering algorithm after the variables are input. The method makes full use of the high-precision angle measurement information of the photoelectric tracking equipment, thereby greatly improving the tracking system's state estimation precision for the moving target.

Description

Motion state estimation method and system for tracking target and electronic device
Technical Field
The present invention relates to tracking of a target, and in particular, to a method, a system, and an electronic device for estimating a motion state of a tracked target, and a computer readable storage medium.
Background
In a target tracking system, the target must be angle-measured and range-measured in real time, and the target's position, velocity, and acceleration are obtained through an estimation (filtering) algorithm from the measured position information, which carries measurement noise. Photoelectric tracking equipment and tracking radar are the most widely used observation devices for this purpose.
Owing to their different working principles, the two kinds of observation device have complementary advantages and disadvantages. The angle and distance measurements of photoelectric tracking equipment can reach high precision; however, its detection range and ranging performance are easily affected by the environment, and its ranging frequency is limited by the physical characteristics of the laser rangefinder, commonly 12.5 Hz at present. Although the angle and distance measurement precision of tracking radar is not as good as that of photoelectric equipment, the radar works in all weather, is not easily affected by the environment, and its ranging frequency can reach 50 Hz.
In existing target tracking systems configured with both photoelectric and radar trackers, when the dual trackers track the same target, the motion state is estimated from the observation data of a single detector. This usage cannot fully exploit the respective advantages of the two detectors: when only photoelectric angle measurements are available without ranging information, the tracking system cannot use the photoelectric high-precision angle information, which reduces the estimation precision of the target's motion state.
Disclosure of Invention
To address the still-limited estimation precision of the target motion state in existing target tracking systems, the invention provides a motion state estimation method, system, and electronic device for a tracked target, together with a computer-readable storage medium, so that the motion state of the tracked target is estimated with higher precision.
The invention is realized by the following technical scheme:
In a first aspect of the present application, a motion state estimation method for a tracked target is adopted, in which a photoelectric tracking device and a radar tracking device track the same target; the motion state estimation method comprises the following operations:
Carrying out real-time statistics on the laser echo rate obtained by the photoelectric tracking equipment; judging the laser echo rate: when the laser echo rate is greater than or equal to a set threshold value, only inputting data monitored by the photoelectric tracking equipment into a filtering algorithm as a variable; when the laser echo rate is smaller than a set threshold value, inputting the obtained photoelectric angle measurement and radar ranging data into a filtering algorithm as variables; and estimating the motion state of the tracking target according to a filtering algorithm after the variables are input.
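The echo-rate switching rule above can be sketched as follows. This is an illustrative sketch, not the patented implementation; the names (select_measurements, THRESHOLD, the dictionary keys) are assumptions.

```python
# Illustrative sketch of the echo-rate switching rule; all names are assumed.

THRESHOLD = 0.5  # assumed set threshold, within the 40%-60% range given later


def select_measurements(echo_rate, eo_angles, eo_range, radar_range):
    """Choose the observations that feed the filtering algorithm this cycle."""
    if echo_rate >= THRESHOLD:
        # Echo rate high enough: use only the photoelectric (EO) tracker's data
        return {"angles": eo_angles, "range": eo_range}
    # Echo rate too low: fuse EO angle measurements with radar ranging
    return {"angles": eo_angles, "range": radar_range}
```

In either branch the high-precision EO angle measurements are retained; only the range source changes, which is the point of the method.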
According to this motion state estimation method, when the dual trackers track the same target and the filtering has stabilized, if the laser echo rate is smaller than the set threshold, valid photoelectric angle measurements and radar ranging data are used instead of the radar's own angle and ranging data. The method can thus make full use of the high-precision angle measurement information of the photoelectric tracking equipment, greatly improving the tracking system's state estimation precision for the moving target.
Further, if the photoelectric and radar data are not simultaneously valid, that is, the obtained photoelectric measurements and radar measurements are valid at different times, the motion state of the tracked target can still be estimated in other ways. The method then further comprises: when only the photoelectric angle measurement and ranging data are judged valid, inputting only the data monitored by the photoelectric tracking equipment into the filtering algorithm as variables; or, when only the radar angle measurement and ranging data are judged valid, inputting the radar angle measurement and ranging data into the filtering algorithm as variables. In this way, the method can keep estimating the motion state of the tracked target under various conditions.
Specifically, before the laser echo rate is judged, whether the filtering is stable is determined; stable means that the output of the filtering algorithm has reached a steady state after several periods of operation, i.e. the output does not diverge.
When the filtering is judged stable but the obtained photoelectric angle measurement and radar ranging data are invalid, whether the radar angle measurement and ranging data are valid is judged:
If the radar angle measurement and ranging data are valid, they are input into the filtering algorithm as variables, and the motion state of the tracked target is estimated according to the filtering algorithm after the variables are input; if they are invalid, whether the photoelectric angle measurement and ranging data are valid is judged: when the photoelectric angle measurement and ranging data are valid, they are input into the filtering algorithm as variables, and the motion state of the tracked target is estimated accordingly.
Specifically, if the filtering is unstable, whether the photoelectric angle measurement and ranging data are valid is judged:
When the photoelectric angle measurement and ranging data are valid, only those data are input into the filtering algorithm as variables, and the motion state of the tracked target is estimated according to the filtering algorithm after the variables are input; when the photoelectric angle measurement and ranging data are invalid, whether the radar angle measurement and ranging data are valid is judged: if valid, the radar angle measurement and ranging data are input into the filtering algorithm as variables, and the motion state of the tracked target is estimated accordingly.
Further, when the photoelectric tracking device and the radar tracking device are found not to be tracking the same target, the following procedure can be performed:
Judge whether the photoelectric tracking device and the radar tracking device track the same target; if not, judge whether the photoelectric tracking device tracks the target, and if it does not, judge whether the radar tracking device tracks the target. Because the photoelectric measurement precision is higher than that of the radar, the photoelectric measurement information takes priority; only when the photoelectric tracking device is not tracking the target is it judged whether the radar tracking device tracks the target.
Specifically, in a state that the photoelectric tracking equipment and the radar tracking equipment do not track the same target, if the photoelectric tracking equipment tracks the target, only data monitored by the photoelectric tracking equipment are used as variables to be input into a filtering algorithm, and the motion state of the tracked target is estimated according to the filtering algorithm after the variables are input; or under the condition that the photoelectric tracking equipment and the radar tracking equipment do not track the same target, if the radar tracking equipment tracks the target, radar angle measurement and ranging data are input into a filtering algorithm as variables, and the motion state of the tracked target is estimated according to the filtering algorithm after the variables are input.
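The priority rule above can be illustrated as a small selection function; this is a hypothetical helper, and all names are assumed for illustration only.

```python
# Hypothetical helper mirroring the priority rule: when the two devices are
# not tracking the same target, the higher-precision photoelectric (EO) data
# take priority over the radar data.

def select_when_targets_differ(eo_tracking: bool, radar_tracking: bool):
    """Return which device's angle/ranging data feed the filter."""
    if eo_tracking:
        return "eo"        # EO angle measurement and ranging data only
    if radar_tracking:
        return "radar"     # radar angle measurement and ranging data
    return None            # neither device is tracking a target
```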
Further, the laser echo rate is calculated dynamically by a sliding-window method.
The sliding window covers the most recent laser ranging samples; at every ranging period the window slides forward by one sample, so that the echo-rate statistic always reflects the latest data.
The laser echo rate is calculated by the formula
η = N / (L × F)
wherein η is the laser echo rate, N is the number of valid laser data in the sliding window, L is the length of the sliding window in seconds, and F is the laser ranging frequency.
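As a sketch under the definitions above, the echo rate η = N / (L × F) can be computed over a sliding window as follows; the function and parameter names are illustrative, with default values taken from the embodiment (L ≈ 2 s, F = 12.5 Hz).

```python
# Illustrative echo-rate computation; names and defaults are assumptions
# based on the embodiment's stated values (window 1.6-2.4 s, F = 12.5 Hz).

def echo_rate(valid_flags, window_len_s=2.0, freq_hz=12.5):
    """η = N / (L·F): fraction of valid laser returns in the last L seconds.

    valid_flags is a list of 0/1 flags, one per laser ranging period,
    in chronological order (most recent last).
    """
    window_size = int(window_len_s * freq_hz)  # samples per window = L * F
    window = valid_flags[-window_size:]        # keep only the newest samples
    return sum(window) / window_size           # N / (L·F)
```

Each new ranging period, one flag is appended and the oldest drops out of the slice, which matches the one-sample slide described above.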
In a second aspect, the invention also provides an electronic device comprising a processor and a memory; the memory is used for storing processor-executable instructions, and the processor is configured to perform the motion state estimation method of the tracked target of the first aspect and any of its refinements.
By adopting the electronic device, the photoelectric tracking equipment and the radar tracking equipment are controlled, so that the high-precision angle measurement information of the photoelectric tracking equipment can be fully utilized, and the state estimation precision of the tracking system on the moving target is improved.
In a third aspect, the present invention also provides a computer-readable storage medium containing a stored computer program which, when executed, performs the motion state estimation method of the tracked target of the first aspect and any of its refinements.
By using the computer readable storage medium, the high-precision angle measurement information of the photoelectric tracking equipment can be fully utilized, and the state estimation precision of the tracking system on the moving target is improved.
In a fourth aspect of the present invention, the present invention further provides a motion state estimation system for a tracking target, where the motion state estimation system for a tracking target includes an optoelectronic tracking device and a radar tracking device, and further includes the electronic apparatus of the second aspect connected to the optoelectronic tracking device and the radar tracking device, respectively.
Compared with existing target tracking systems, this motion state estimation system can fully exert the advantages of the photoelectric tracking equipment and the radar tracking equipment, so that the tracking system can utilize the photoelectric high-precision angle measurement information even when only photoelectric angle measurements and no ranging information are available, improving the estimation precision of the target's motion state.
Drawings
The accompanying drawings, which are included to provide a further understanding of embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings:
Fig. 1 is a flowchart explaining the steps of the motion state estimation method of the tracked target according to the first embodiment;
Fig. 2 is a schematic diagram illustrating a motion state estimation system of a tracked target according to the present invention;
Fig. 3 is a schematic diagram illustrating an electronic device according to the present invention;
Fig. 4 is a flowchart illustrating the steps of the motion state estimation method of the tracked target according to the second embodiment;
Fig. 5 is a flowchart explaining the steps of the motion state estimation method of the tracked target of the third embodiment;
Fig. 6 is a flowchart showing the steps performed in the motion state estimation method of the tracked target according to the third embodiment;
Reference numerals and corresponding part names:
1-electronic device, 2-photoelectric tracking equipment, 3-radar tracking equipment, 101-processor, 102-memory, 103-communication interface, 104-bus.
Detailed Description
For the purpose of making apparent the objects, technical solutions and advantages of the present invention, the present invention will be further described in detail with reference to the following examples and the accompanying drawings, wherein the exemplary embodiments of the present invention and the descriptions thereof are for illustrating the present invention only and are not to be construed as limiting the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that: no such specific details are necessary to practice the invention. In other instances, well-known structures, circuits, materials, or methods have not been described in detail in order not to obscure the invention.
Throughout the specification, references to "one embodiment," "an embodiment," "one example," or "an example" mean: a particular feature, structure, or characteristic described in connection with the embodiment or example is included within at least one embodiment of the invention. Thus, the appearances of the phrases "in one embodiment," "in an example," or "in an example" in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures, or characteristics may be combined in any suitable combination and/or sub-combination in one or more embodiments or examples. Moreover, those of ordinary skill in the art will appreciate that the illustrations provided herein are for illustrative purposes and that the illustrations are not necessarily drawn to scale. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
In the description of the present invention, it should be understood that the terms "front", "rear", "left", "right", "upper", "lower", "vertical", "horizontal", "high", "low", "inner", "outer", etc. indicate orientations or positional relationships based on the drawings, are merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the apparatus or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the scope of the present invention.
Example 1
This embodiment adopts a motion state estimation method for a tracked target, in which photoelectric tracking equipment and radar tracking equipment track the same target; the method comprises the following operations:
Carrying out real-time statistics on the laser echo rate obtained by the photoelectric tracking equipment; judging the laser echo rate: when the laser echo rate is greater than or equal to a set threshold value, only inputting data monitored by the photoelectric tracking equipment into a filtering algorithm as a variable; when the laser echo rate is smaller than a set threshold value, inputting the obtained photoelectric angle measurement and radar ranging data into a filtering algorithm as variables; and estimating the motion state of the tracking target according to a filtering algorithm after the variables are input.
When the filtering is stable, if the laser echo rate is smaller than the set threshold, valid photoelectric angle measurements and radar ranging data are used instead of the radar's own angle and ranging data; the method can thus make full use of the high-precision angle measurement information of the photoelectric tracking equipment, greatly improving the tracking system's state estimation precision for the moving target.
As shown in fig. 2 and 3, the present embodiment provides a motion state estimation system for tracking a target, including an electronic apparatus 1, a photoelectric tracking device 2, and a radar tracking device 3, where the electronic apparatus 1 includes a processor 101 and a memory 102; the memory 102 is used for storing instructions executable by the processor 101; the processor 101 is configured to perform the motion state estimation method of the tracking target of the present application.
The electronic device 1 is electrically connected with the photoelectric tracking device 2 and the radar tracking device 3. The photoelectric tracking device 2 can be any device based on the photoelectric tracking measurement principle, such as a photoelectric pod, a photoelectric tracker, or a laser ranging device; tracking in this embodiment means automatic tracking of the target by the device. The radar tracking device 3 may be a radar that continuously tracks a target and measures its coordinates; it is composed of a distance tracking branch, an azimuth tracking branch, and an elevation tracking branch, which respectively accomplish automatic tracking of the target's distance, azimuth, and elevation and continuously measure them.
After receiving feedback information of the photoelectric tracking device 2 and the radar tracking device 3 for tracking the target, the processor 101 in the electronic device 1 performs program processing of the motion state estimation method of the tracking target to obtain a real-time result of estimating the motion state of the tracking target.
The laser echo rate is calculated dynamically by a sliding-window method, using the formula
η = N / (L × F)
wherein η is the laser echo rate, N is the number of valid laser data in the sliding window, L is the length of the sliding window in seconds, and F is the laser ranging frequency. In this embodiment, the sliding-window length L is 1.6 to 2.4 seconds when the laser ranging frequency is 12.5 Hz, and the threshold of the laser echo rate is chosen in the range 40%-60%.
When the laser echo rate counted in real time by the electronic device 1 is greater than or equal to the set threshold (chosen from the range 40%-60%), only the data monitored by the photoelectric tracking equipment 2 are input into the filtering algorithm as variables; when the laser echo rate is smaller than the threshold, the obtained photoelectric angle measurements and radar ranging data are input into the filtering algorithm as variables. The filtering algorithm in this embodiment is a Kalman filter; the target motion model is the Singer model, and the current position, velocity, and acceleration of the target are estimated from the measured, noisy target position data.
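A minimal single-axis sketch of the Kalman filtering step with a Singer motion model follows. The patent names the Kalman filter and the Singer model but gives no matrices, so the transition matrix below is the standard textbook Singer form with an assumed reciprocal maneuver time constant alpha; the function names are illustrative.

```python
import numpy as np


def singer_transition(T, alpha):
    """Single-axis Singer-model transition matrix for the state
    [position, velocity, acceleration]; alpha is the reciprocal of the
    maneuver time constant (an assumed tuning parameter)."""
    e = np.exp(-alpha * T)
    return np.array([
        [1.0, T, (alpha * T - 1.0 + e) / alpha**2],
        [0.0, 1.0, (1.0 - e) / alpha],
        [0.0, 0.0, e],
    ])


def kalman_step(x, P, z, F, H, Q, R):
    """One standard Kalman predict + update cycle on a noisy measurement z."""
    # Predict the state and covariance forward one period
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the measurement
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

With a position-only measurement matrix H = [1, 0, 0] per axis, feeding in the selected angle/ranging observations (converted to Cartesian position) yields the estimated position, velocity, and acceleration of the target.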
As shown in fig. 1, the specific implementation steps are as follows:
A1, carry out real-time statistics of the laser echo rate acquired by the photoelectric tracking equipment 2.
A2, judge the laser echo rate:
A2.1, when the laser echo rate is greater than or equal to the set threshold, input only the data monitored by the photoelectric tracking equipment 2 into the filtering algorithm as variables.
A2.2, when the laser echo rate is smaller than the set threshold, input the obtained photoelectric angle measurements and radar ranging data into the filtering algorithm as variables.
A3, estimate the motion state of the tracked target according to the filtering algorithm after the variables are input.
In the electronic apparatus 1 of the present embodiment, the processor 101 may comprise a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the motion state estimation method of the tracked target.
The memory 102 may include mass storage for data or instructions. By way of example and not limitation, the memory 102 may comprise a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disk, a magneto-optical disk, magnetic tape, a universal serial bus (USB) drive, or a combination of two or more of these. The memory 102 may include removable or non-removable (or fixed) media, where appropriate, and may be internal or external to the data processing apparatus. In a particular embodiment, the memory 102 is non-volatile solid-state memory. In a particular embodiment, the memory 102 includes read-only memory (ROM), which may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically rewritable ROM (EAROM), or flash memory, or a combination of two or more of these, where appropriate. The processor 101 implements the motion state estimation method described above by reading and executing the computer program instructions stored in the memory 102.
In a further embodiment, the electronic device 1 may further comprise a communication interface 103 and a bus 104. As shown in Fig. 3, the processor 101, the memory 102, and the communication interface 103 are connected to each other by the bus 104 and communicate with each other.
The communication interface 103 is mainly used to realize communication among the modules, devices, units, and/or equipment required by the motion state estimation method of the tracked target. The bus 104 includes hardware, software, or both, coupling the components of the electronic device 1 to one another. By way of example and not limitation, the bus may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association local bus (VLB), another suitable bus, or a combination of two or more of these. The bus 104 may include one or more buses, where appropriate. Although a particular bus is described and illustrated, the present invention contemplates any suitable bus or interconnect.
Example 2
In addition to the motion state estimation system of embodiment 1, embodiment 2 provides another motion state estimation system for a tracked target, which can estimate the motion state in other ways when the obtained photoelectric angle measurement and radar ranging data are invalid. Data invalidity here means that no new data were received during an operation cycle, so the data cannot be used as input to the filtering algorithm during that cycle.
The embodiment provides a motion state estimation system for tracking a target, which comprises an electronic device 1, a photoelectric tracking device 2 and a radar tracking device 3, wherein the electronic device 1 comprises a processor 101 and a memory 102; the memory 102 is used for storing instructions executable by the processor 101; the processor 101 is configured to execute the motion state estimation method of the tracking target employed in embodiment 1, and the motion state estimation method of the tracking target further includes:
When only photoelectric angle measurement and ranging data are judged to be effective, only data monitored by the photoelectric tracking equipment 2 are used as variables to be input into a filtering algorithm; or the motion state estimation method of the tracking target further comprises the following steps: when the radar angle measurement and ranging data are judged to be valid, the radar angle measurement and ranging data are input into a filtering algorithm as variables.
In a preferred embodiment of the present invention, whether the filtering is stable is determined before the laser echo rate is judged.
When the filtering is judged stable but the obtained photoelectric angle measurement and radar ranging data are invalid, whether the radar angle measurement and ranging data are valid is judged:
If the radar angle measurement and ranging data are valid, they are input into the filtering algorithm as variables, and the motion state of the tracked target is estimated according to the filtering algorithm after the variables are input; if they are invalid, whether the photoelectric angle measurement and ranging data are valid is judged: when the photoelectric angle measurement and ranging data are valid, they are input into the filtering algorithm as variables, and the motion state of the tracked target is estimated accordingly.
If the filtering is unstable, whether the photoelectric angle measurement and ranging data are valid is judged:
When the photoelectric angle measurement and ranging data are valid, only those data are input into the filtering algorithm as variables, and the motion state of the tracked target is estimated according to the filtering algorithm after the variables are input; when they are invalid, whether the radar angle measurement and ranging data are valid is judged: if valid, the radar angle measurement and ranging data are input into the filtering algorithm as variables, and the motion state of the tracked target is estimated accordingly.
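The validity fallback logic of this embodiment can be summarized as a decision function; the return labels and argument names are assumptions used only for illustration.

```python
# Illustrative encoding of the fallback decision tree (all names assumed).
# Each branch mirrors one of the validity checks described in the text.

def choose_filter_input(stable, eo_valid, radar_valid, eo_radar_valid):
    """Return which measurement set feeds the filter this cycle,
    or None when nothing valid is available (skip the cycle)."""
    if stable:
        if eo_radar_valid:             # EO angles + radar ranging available
            return "eo_angle+radar_range"
        if radar_valid:                # fall back to radar angles + ranging
            return "radar"
        if eo_valid:                   # fall back to EO angles + ranging
            return "eo"
        return None                    # no valid data this cycle
    # Filter not yet stable: prefer the higher-precision EO data
    if eo_valid:
        return "eo"
    if radar_valid:
        return "radar"
    return None
```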
As shown in fig. 4, the specific implementation steps are as follows:
B0, before judging the laser echo rate, judging whether the filtering is in a stable state; if so, performing step B1; otherwise, performing step B2;
B1, judging whether the acquired photoelectric angle measurement and radar ranging data are valid; if valid, inputting the acquired photoelectric angle measurement and radar ranging data into the filtering algorithm as variables and performing step B3; if invalid, performing step B1.2;
B1.2, judging whether the radar angle measurement and ranging data are valid; if valid, performing step B1.2.1; if invalid, performing step B1.2.2;
B1.2.1, the radar angle measurement and ranging data being valid, inputting them into the filtering algorithm as variables and performing step B3;
B1.2.2, the radar angle measurement and ranging data being invalid, judging whether the photoelectric angle measurement and ranging data are valid; if valid, performing step B1.2.2.1; if invalid, performing step B1.2.2.2;
B1.2.2.1, the photoelectric angle measurement and ranging data being valid, inputting them into the filtering algorithm as variables and performing step B3;
B1.2.2.2, the photoelectric angle measurement and ranging data being invalid, performing step B4;
B2, judging whether the photoelectric angle measurement and ranging data are valid; if valid, performing step B2.1; if invalid, performing step B2.2;
B2.1, the photoelectric angle measurement and ranging data being valid, inputting only the photoelectric angle measurement and ranging data into the filtering algorithm as variables and performing step B3;
B2.2, the photoelectric angle measurement and ranging data being invalid, judging whether the radar angle measurement and ranging data are valid; if valid, performing step B2.2.1; if invalid, performing step B2.2.2;
B2.2.1, the radar angle measurement and ranging data being valid, inputting them into the filtering algorithm as variables and performing step B3;
B2.2.2, the radar angle measurement and ranging data being invalid, performing step B4;
B3, estimating the motion state of the tracking target according to the filtering algorithm after the variables are input;
B4, judging that both the photoelectric angle measurement and ranging data and the radar angle measurement and ranging data are invalid.
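The decision flow of steps B0-B4 can be condensed into a single selection function. The sketch below is illustrative only; the function and flag names (`select_measurement`, `filter_stable`, and so on) are my own and do not appear in the patent:

```python
def select_measurement(filter_stable,
                       eo_angle_radar_range_valid,
                       radar_valid,
                       eo_valid):
    """Return which data to feed the filter this cycle, per steps B0-B4.

    filter_stable: B0 - is the filtering in a stable state?
    eo_angle_radar_range_valid: B1 - photoelectric angles plus radar range valid?
    radar_valid: radar angle measurement and ranging data valid?
    eo_valid: photoelectric angle measurement and ranging data valid?
    """
    if filter_stable:                       # B0 -> B1
        if eo_angle_radar_range_valid:      # B1: prefer EO angles + radar range
            return "eo_angle+radar_range"
        if radar_valid:                     # B1.2.1
            return "radar"
        if eo_valid:                        # B1.2.2.1
            return "eo"
        return None                         # B4: all measurements invalid
    else:                                   # B0 -> B2
        if eo_valid:                        # B2.1: EO only while converging
            return "eo"
        if radar_valid:                     # B2.2.1
            return "radar"
        return None                         # B4
```

Note that the two branches prefer different sources: once the filter is stable, the fused photoelectric-angle/radar-range pair is tried first; while it is still converging, the photoelectric data alone take priority.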
Example 3
Such processing is performed when the electronic apparatus 1 of the present application judges that the photoelectric tracking device 2 and/or the radar tracking device 3 is not tracking the target. In view of this situation, embodiment 3 provides a motion state estimation system for a tracked target on the basis of embodiment 1 or 2, so that the motion state of the tracked target can still be estimated when it is judged that the photoelectric tracking device 2 and/or the radar tracking device 3 is not tracking the target.
In the motion state estimation system of the present embodiment, the motion state estimation method executed by the processor 101 of the electronic device 1 further includes: judging whether the photoelectric tracking device 2 and the radar tracking device 3 track the same target; if they do not track the same target, judging whether the photoelectric tracking device 2 tracks the target; and if the photoelectric tracking device 2 does not track the target, judging whether the radar tracking device 3 tracks the target.
When the photoelectric tracking device 2 and the radar tracking device 3 do not track the same target, if the photoelectric tracking device 2 tracks the target, only the data monitored by the photoelectric tracking device 2 are input into the filtering algorithm as variables, and the motion state of the tracked target is estimated according to the filtering algorithm after the variables are input. Alternatively, when the photoelectric tracking device 2 and the radar tracking device 3 do not track the same target, if the radar tracking device 3 tracks the target, the radar angle measurement and ranging data are input into the filtering algorithm as variables, and the motion state of the tracked target is estimated according to the filtering algorithm after the variables are input.
This embodiment 3 is illustrated on the basis of embodiment 2 and performs the following steps:
When the two trackers track the same target but the photoelectric laser echo rate is insufficient, the high-precision photoelectric angle measurement information is used to improve the estimation precision of the moving target's state. As shown in fig. 5, the specific steps are as follows:
S1, when the two trackers track the same target, the laser echo rate is counted in real time in each calculation period.
The laser echo rate is counted by a sliding window method; with a laser ranging frequency of 12.5 Hz, the sliding window length L is 1.6-2.4 seconds. Assuming the number of effective laser echoes in the sliding window is N in one calculation period, the laser echo rate is calculated as η = N / (L × F), where F is the laser ranging frequency.
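As an illustration of this sliding-window statistic, the sketch below keeps the last L × F laser pulses and returns η = N / (L × F) on each update. The class name and API are my own assumptions; the defaults L = 2.0 s and F = 12.5 Hz follow the values given in the text:

```python
from collections import deque

class EchoRateCounter:
    """Sliding-window laser echo rate: eta = N / (L * F)."""

    def __init__(self, window_seconds=2.0, ranging_freq_hz=12.5):
        self.capacity = int(window_seconds * ranging_freq_hz)  # L * F pulses
        self.window = deque(maxlen=self.capacity)              # 1 = valid echo

    def update(self, echo_valid):
        """Record one laser pulse and return the current echo rate eta."""
        self.window.append(1 if echo_valid else 0)
        n_valid = sum(self.window)        # N: effective echoes in the window
        return n_valid / self.capacity    # eta = N / (L * F)
```

With the fixed denominator L × F, the rate is underestimated until the window has filled, which matches the formula as stated rather than a normalization over the samples seen so far.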
S2, judging whether the photoelectric and radar trackers are both in the automatic tracking state; if yes, turning to step S3; if not, turning to step S11;
S3, judging whether the filtering is stable; if not, turning to step S4; if yes, turning to step S6;
S4, judging whether the photoelectric angle measurement and ranging data are valid; if yes, acquiring the photoelectric angle measurement and ranging data; if not, turning to step S5;
S5, judging whether the radar angle measurement and ranging data are valid; if yes, acquiring the radar angle measurement and ranging data; if not, setting the measurement data of the current period invalid;
S6, judging whether the laser echo rate η is greater than or equal to the set threshold θ (θ ranges from 40% to 60%); if yes, turning to step S7; if not, turning to step S8;
S7, judging whether the photoelectric angle measurement and ranging data are valid; if yes, acquiring the photoelectric angle measurement and ranging data; if not, setting the measurement data of the current period invalid;
S8, judging whether the photoelectric angle measurement and radar ranging data are valid; if yes, acquiring the photoelectric angle measurement and radar ranging data; if not, turning to step S9;
S9, judging whether the radar angle measurement and ranging data are valid; if yes, acquiring the radar angle measurement and ranging data; if not, turning to step S10;
S10, judging whether the photoelectric angle measurement and ranging data are valid; if yes, acquiring the photoelectric angle measurement and ranging data; if not, setting the measurement data of the current period invalid;
S11, judging whether the photoelectric tracker is in the automatic tracking state; if yes, turning to step S12; if not, turning to step S13;
S12, judging whether the photoelectric angle measurement and ranging data are valid; if yes, acquiring the photoelectric angle measurement and ranging data; if not, setting the measurement data of the current period invalid;
S13, judging whether the radar is in the automatic tracking state; if not, setting the measurement data of the current period invalid; if yes, turning to step S14;
S14, judging whether the radar angle measurement and ranging data are valid; if yes, acquiring the radar angle measurement and ranging data; if not, setting the measurement data of the current period invalid;
Through steps S1 to S14, the target position measurement data of one calculation cycle can be acquired as the input data of the filtering algorithm.
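The measurement-source selection of steps S2-S14 can likewise be sketched as a single function. This is an illustrative sketch; every name (`choose_source` and the boolean flags) is invented for the example and not taken from the patent:

```python
def choose_source(eo_auto, radar_auto, filter_stable, echo_rate, theta,
                  eo_valid, radar_valid, eo_angle_radar_range_valid):
    """Pick this cycle's measurement source per steps S2-S14.

    Returns a source tag, or None when the cycle's measurement is invalid.
    """
    if eo_auto and radar_auto:                       # S2: both auto-tracking
        if not filter_stable:                        # S3 -> S4/S5
            if eo_valid:                             # S4
                return "eo"
            return "radar" if radar_valid else None  # S5
        if echo_rate >= theta:                       # S6: enough laser echoes
            return "eo" if eo_valid else None        # S7
        if eo_angle_radar_range_valid:               # S8: EO angles + radar range
            return "eo_angle+radar_range"
        if radar_valid:                              # S9
            return "radar"
        return "eo" if eo_valid else None            # S10
    if eo_auto:                                      # S11 -> S12
        return "eo" if eo_valid else None
    if radar_auto:                                   # S13 -> S14
        return "radar" if radar_valid else None
    return None                                      # neither tracker is tracking
```

The branch at S6 is the heart of the scheme: with the filter stable but the echo rate below θ, the photoelectric angles are paired with the radar range before falling back to either sensor alone.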
According to the above technical scheme, when the dual trackers track the same target and the filtering is stable, if the laser echo rate is smaller than the set threshold θ (step S6), the valid photoelectric angle measurement and radar ranging data are acquired instead of the radar's own angle measurement and ranging data. This makes full use of the high-precision angle measurement information of the photoelectric tracking equipment 2 and improves the tracking system's state estimation precision for the moving target.
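To illustrate how a fused spherical measurement (azimuth and elevation from the photoelectric tracker, range from the radar) could feed a position filter, the sketch below converts it to Cartesian coordinates. The East-North-Up convention and radian units are my assumptions for the example; the patent does not specify a coordinate frame:

```python
import math

def spherical_to_cartesian(azimuth_rad, elevation_rad, range_m):
    """East-North-Up position from a fused angle/range measurement.

    azimuth_rad and elevation_rad would come from the photoelectric
    tracker, range_m from the radar (hypothetical convention: azimuth
    measured clockwise from north, elevation up from the horizon).
    """
    horizontal = range_m * math.cos(elevation_rad)  # ground-plane projection
    x = horizontal * math.sin(azimuth_rad)          # east
    y = horizontal * math.cos(azimuth_rad)          # north
    z = range_m * math.sin(elevation_rad)           # up
    return x, y, z
```

Because angular error scales the Cartesian error by the range, substituting the photoelectric tracker's more precise angles at long range directly tightens the position measurement handed to the filter.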
Combining embodiment 1, embodiment 2 and embodiment 3 gives the flow chart shown in fig. 6.
The foregoing description of the embodiments illustrates the general principles of the invention and is not intended to limit the invention to the particular embodiments disclosed; any modifications, equivalents, improvements, and the like that fall within the spirit and principles of the invention are intended to be included within its scope.

Claims (9)

1. A motion state estimation method for a tracked target, characterized in that the method is used when a photoelectric tracking device and a radar tracking device track the same target;
the motion state estimation method of the tracking target comprises the following operations:
carrying out real-time statistics on the laser echo rate obtained by the photoelectric tracking equipment;
judging the laser echo rate:
When the laser echo rate is greater than or equal to a set threshold value, only inputting data monitored by the photoelectric tracking equipment into a filtering algorithm as a variable;
When the laser echo rate is smaller than a set threshold value, inputting the obtained photoelectric angle measurement and radar ranging data into a filtering algorithm as variables;
Estimating the motion state of the tracking target according to a filtering algorithm after the variables are input;
judging whether filtering is stable or not before judging the laser echo rate;
when the filtering is judged stable and the obtained photoelectric angle measurement and radar ranging data are invalid, judging whether the radar angle measurement and ranging data are valid:
if the radar angle measurement and ranging data are valid, inputting the radar angle measurement and ranging data into the filtering algorithm as variables, and estimating the motion state of the tracking target according to the filtering algorithm after the variables are input;
if the radar angle measurement and ranging data are invalid, judging whether the photoelectric angle measurement and ranging data are valid:
when the photoelectric angle measurement and ranging data are valid, inputting the photoelectric angle measurement and ranging data into the filtering algorithm as variables, and estimating the motion state of the tracking target according to the filtering algorithm after the variables are input.
2. The method for motion state estimation of a tracking target according to claim 1, wherein,
The motion state estimation method of the tracking target further comprises the following steps:
when only photoelectric angle measurement and ranging data are judged to be effective, only data monitored by photoelectric tracking equipment are used as variables to be input into a filtering algorithm; or alternatively
The motion state estimation method of the tracking target further comprises the following steps:
When the radar angle measurement and ranging data are judged to be valid, the radar angle measurement and ranging data are input into a filtering algorithm as variables.
3. The method for motion state estimation of a tracking target according to claim 1, wherein,
Judging whether filtering is stable or not before judging the laser echo rate;
If the filtering is unstable, judging whether the photoelectric angle measurement and ranging data are valid or not:
When the photoelectric angle measurement and distance measurement data are valid, only the photoelectric angle measurement and distance measurement data are used as variables to be input into a filtering algorithm, and the motion state of a tracking target is estimated according to the filtering algorithm after the variables are input;
when the photoelectric angle measurement and ranging data are invalid, judging whether the radar angle measurement and ranging data are valid:
If the radar angle measurement and ranging data are valid, the radar angle measurement and ranging data are used as variables to be input into a filtering algorithm, and the motion state of a tracking target is estimated according to the filtering algorithm after the variables are input.
4. The method for motion state estimation of a tracking target according to claim 1, wherein,
judging whether the photoelectric tracking device and the radar tracking device track the same target; if they do not track the same target, judging whether the photoelectric tracking device tracks the target; and if the photoelectric tracking device does not track the target, judging whether the radar tracking device tracks the target.
5. The method for motion state estimation of a tracking target according to claim 4,
Under the condition that the photoelectric tracking equipment and the radar tracking equipment do not track the same target, if the photoelectric tracking equipment tracks the target, only data monitored by the photoelectric tracking equipment are used as variables to be input into a filtering algorithm, and the motion state of the tracked target is estimated according to the filtering algorithm after the variables are input; or alternatively
Under the condition that the photoelectric tracking equipment and the radar tracking equipment do not track the same target, if the radar tracking equipment tracks the target, radar angle measurement and ranging data are input into a filtering algorithm as variables, and the motion state of the tracked target is estimated according to the filtering algorithm after the variables are input.
6. The method for motion state estimation of a tracking target according to claim 1, wherein,
the laser echo rate is dynamically calculated by a sliding window method; the calculation formula is η = N / (L × F),
Wherein eta is the laser echo rate, N is the effective number of laser data in the sliding window, L is the length of the sliding window, and F is the laser ranging frequency.
7. An electronic device comprising a processor and a memory;
The memory is used for storing processor executable instructions;
the processor is configured to perform the method of motion state estimation of a tracked object according to any one of claims 1-6.
8. A computer-readable storage medium, characterized in that: comprising a stored computer program which, when run, performs the method of motion state estimation of a tracking target according to any one of claims 1-6.
9. A motion state estimation system for tracking a target, comprising an optoelectronic tracking device, a radar tracking device, and the electronic apparatus of claim 7 connected to the optoelectronic tracking device and the radar tracking device, respectively.
CN202111044918.2A 2021-09-07 2021-09-07 Motion state estimation method and system for tracking target and electronic device Active CN113740843B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111044918.2A CN113740843B (en) 2021-09-07 2021-09-07 Motion state estimation method and system for tracking target and electronic device


Publications (2)

Publication Number Publication Date
CN113740843A CN113740843A (en) 2021-12-03
CN113740843B true CN113740843B (en) 2024-05-07

Family

ID=78736630

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111044918.2A Active CN113740843B (en) 2021-09-07 2021-09-07 Motion state estimation method and system for tracking target and electronic device

Country Status (1)

Country Link
CN (1) CN113740843B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115436902B (en) * 2022-09-15 2024-06-14 中国人民解放军国防科技大学 Angle error estimation method and device based on three-channel joint detection

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011056107A1 (en) * 2009-11-06 2011-05-12 Saab Ab Radar system and method for detecting and tracking a target
JP2011242181A (en) * 2010-05-17 2011-12-01 Mitsubishi Electric Corp Target tracking device
CN110103958A (en) * 2018-01-30 2019-08-09 Toyota Motor Engineering & Manufacturing North America Fusing leading-vehicle sensor data for detection and ranging of preceding objects
CN110346788A (en) * 2019-06-14 2019-10-18 Beijing Leijiu Technology Co., Ltd. Full-trajectory tracking method for highly maneuvering and hovering targets based on radar and photoelectric fusion
WO2020135810A1 (en) * 2018-12-29 2020-07-02 华为技术有限公司 Multi-sensor data fusion method and device
CN112904328A (en) * 2021-01-18 2021-06-04 安徽瞭望科技有限公司 Radar photoelectric tracking early warning system and early warning method for offshore wind farm
CN113311398A (en) * 2021-05-31 2021-08-27 零八一电子集团有限公司 Tracking method for high maneuvering dim small target with strong clutter complex background

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10565468B2 (en) * 2016-01-19 2020-02-18 Aptiv Technologies Limited Object tracking system with radar/vision fusion for automated vehicles


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Development of photoelectric theodolite real-time guide; Wu Neng-wei, et al.; Acta Photonica Sinica; Vol. 36, No. 10; 1965-1968 *
Improvement and application of the IMM-UKF algorithm in a two-coordinate radar-photoelectric fusion tracking system; Li Ke; Li Xingfei; Yang Fan; Laser & Optoelectronics Progress; Vol. 53, No. 12; 250-259 *
Lower-bound estimation of the laser echo rate of a photoelectric tracking system; Suo Xiaofeng, et al.; Acta Armamentarii; Vol. 32, No. 7; 832-837 *

Also Published As

Publication number Publication date
CN113740843A (en) 2021-12-03

Similar Documents

Publication Publication Date Title
CN108802722B A weak target detection method based on virtual-spectrum track-before-detect
CN106054169B (en) Multistation Radar Signal Fusion detection method based on tracking information
CN113740843B (en) Motion state estimation method and system for tracking target and electronic device
CN109754406B (en) Lithium battery pole piece burr detection device and method based on two-dimensional contourgraph
CN105842687A (en) Detection tracking integrated method based on RCS prediction information
CN103047982B (en) Adaptive target tracking method based on angle information
CN103809173A (en) Detection and tracking integration method for frame constant false-alarm target
CN110161494B (en) RD plane weak target detection tracking method and device based on velocity square filtering
CN107436434B (en) Track starting method based on bidirectional Doppler estimation
CN109991597A (en) Weak-expansion-target-oriented tracking-before-detection method
CN110187335 Particle-filter track-before-detect method for targets with discontinuous characteristics
CN111623703A (en) Novel Kalman filtering-based Beidou deformation monitoring real-time processing method
RU2724115C1 (en) Method for automatic tracking of a mobile target when smoothing in cartesian coordinates taking into account radial velocity component measurements
CN110133612A (en) A kind of extension target detection method based on tracking feedback
CN106597122B A pulse width detection algorithm for radar and communication signals
CN105652256B (en) A kind of high-frequency ground wave radar TBD methods based on polarization information
RU2556024C2 (en) Moving target coordinates combined smoothing
Wu et al. Detection and tracking of moving target behind wall using UWB through-wall radar
CN114488104B (en) Sky wave beyond-view range radar target tracking method based on interaction consistency
CN111830488B (en) Echo signal data processing method and system based on GM-APD laser ranging system
CN110244289A (en) A kind of adaptive particle filter ground wave radar target integrative detection method
CN116359906A (en) Automatic starting method for cross-period target morphology quality consistency inspection radar target
CN109655057 Filtering optimization method and system for accelerometer measurement values of a six-thruster unmanned aerial vehicle
RU2551356C1 (en) Method of non-strobe automatic tracking of mobile target
CN114861725A (en) Post-processing method, device, equipment and medium for perception and tracking of target

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant