CN106707276B - Precession target radar three-dimensional imaging method based on sliding window EHT - Google Patents

Precession target radar three-dimensional imaging method based on sliding window EHT


Publication number
CN106707276B
Authority
CN
China
Prior art keywords
target
sinusoidal signal
scattering point
amplitude
scattering
Prior art date
Legal status
Active
Application number
CN201611182469.7A
Other languages
Chinese (zh)
Other versions
CN106707276A (en)
Inventor
段锐
张娜
何婷婷
颜光宇
黄勇
张海
汪学刚
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN201611182469.7A
Publication of CN106707276A
Application granted
Publication of CN106707276B
Legal status: Active

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a precession target radar three-dimensional imaging method based on a sliding-window EHT. First, the echo signals of the individual scattering points, each varying as a sinusoidal function, are separated from the target signal. The period and mean parameters of each sinusoidal signal are then estimated to obtain the precession period of the precession target. Next, sliding-window EHT processing is applied to each scattering-point signal to accurately estimate the amplitude and initial-phase parameters of the sinusoidal signal. The amplitude and phase of each scattering point's sinusoidal signal are then calculated at different moments. Finally, the sinusoidal-signal parameters of each scattering point are mapped into the actual position space of the target through coordinate conversion, and a three-dimensional image of the target is reconstructed. Compared with existing spinning-target imaging methods, the method can perform three-dimensional imaging not only of spinning targets but also of precession targets. Moreover, compared with GRT-based three-dimensional imaging methods, it does not require constructing a reference sinusoidal curve for the scattering points before processing.

Description

Precession target radar three-dimensional imaging method based on sliding window EHT
Technical Field
The invention belongs to the technical field of radar imaging, and particularly relates to the design of a precession target radar three-dimensional imaging method based on a sliding-window extended Hough transform (EHT).
Background
A three-dimensional image of a precession target provides information about the shape, structure, size, motion type and state of an aerial target, and can be applied wherever aerial targets must be identified, monitored or given early warning, such as airports, ports, venues of large-scale events and military bases. The key to three-dimensional imaging of a precession target is recovering the instantaneous spatial position information of the target from the radar echo signal. The radar echo of a precession target is composed of the scattered signals of several scattering points on the target; each scattering-point echo traces a sinusoid on the range-slow-time plane, and the amplitude, period, phase and mean of this sinusoid describe, respectively, the range variation extent, the precession period, the instantaneous position and the average position of the scattering point in space. The three-dimensional imaging problem of a precession target can therefore be converted into a parameter-estimation problem for sinusoidal signals.
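As an illustrative sketch of this signal model (not part of the patent text), the Python fragment below generates the sinusoidal range histories of a few scattering points; all parameter values (amplitudes, phases, means, angular velocity, sampling interval) are arbitrary example assumptions.

import numpy as np

# Sketch of the range-history model described above: each scattering point l
# traces r_l(t) = r0_l + A_l * sin(w * t + phi_l) on the range/slow-time plane.
# All parameter values below are illustrative assumptions.
Tr = 0.002                    # slow-time sampling interval (waveform repetition period), s
M = 1000                      # number of slow-time samples
t = np.arange(M) * Tr         # slow-time axis, s
w = 4 * np.pi                 # precession angular velocity, rad/s

# (amplitude A_l, initial phase phi_l, mean r0_l) for three hypothetical scatterers
scatterers = [(0.16, 1.0, -0.30), (0.45, 2.1, -1.45), (0.45, 3.7, -1.64)]

range_histories = np.array(
    [r0 + A * np.sin(w * t + phi) for A, phi, r0 in scatterers]
)                             # shape (L, M): one sinusoidal range history per scatterer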
Existing radar three-dimensional imaging algorithms mainly address the three-dimensional imaging of spinning targets, and the imaging method adopted is the Generalized Radon Transform (GRT). For example, Qi Wang et al. (IEEE Transactions on Geoscience and Remote Sensing, 2008, 46(1): 22-30) proposed a three-dimensional imaging algorithm for spinning point targets based on the GRT-CLEAN technique. In addition, an Extended Hough Transform (EHT) based radar imaging method has also been proposed (IEEE Transactions on Geoscience and Remote Sensing, 2008, 46(1): 291-299), but it remains a two-dimensional imaging method for the spinning target.
Disclosure of Invention
The invention aims to solve the problem that the prior art lacks an effective method for three-dimensional radar imaging of a precession target, and provides a precession target radar three-dimensional imaging method based on a sliding-window Extended Hough Transform (EHT).
The technical scheme of the invention is as follows: a precession target radar three-dimensional imaging method based on a sliding window EHT comprises the following steps:
S1, separating from the target signal the echo signals of the individual scattering points, each of which varies as a sinusoidal function;
S2, estimating the period parameter of the sinusoidal signals to obtain the precession period of the precession target;
S3, estimating the mean parameter of the sinusoidal signals;
S4, performing sliding-window EHT processing on each scattering-point signal, and estimating the amplitude and initial-phase parameters of the sinusoidal signal;
S5, calculating the amplitude and phase of the sinusoidal signal of each scattering point at different moments;
S6, mapping the sinusoidal-signal parameters of each scattering point into the actual position space of the target through coordinate conversion, and reconstructing a three-dimensional image of the target.
Further, step S1 includes the following substeps:
S11, assuming the radar echo signal of the precession target is s(n, t) [the full expression is rendered as an image in the original], where L denotes the number of dominant scattering points contained in the target, n denotes the fast-time sample index with sampling interval Δt_s, and N is the total number of fast-time samples; t denotes the slow-time sample index, the sampling interval being equal to the radar waveform repetition period T_r, and t = 1, …, M, where M is the total number of slow-time samples;
S12, performing matched filtering to obtain the range-slow-time domain signal of the target, S(r, t) [rendered as an image in the original], where r denotes the range-gate index, the range resolution is Δr, r = 1, …, K_r, and the actual range observation extent is R_r = K_r·Δr; t denotes the slow-time sample index; S(r, t) is the range-slow-time image I(r, t) of the target;
S13, separating the image I(r, t) by an empirical mode decomposition algorithm to obtain the range-slow-time image I_l(r, t) corresponding to each scattering point, where l = 1, …, L; each image I_l(r, t) contains one sinusoidal curve, indicating that the scattering-point signal varies sinusoidally in the range-slow-time domain, and the polar-coordinate expression of this curve is

r_l(t) = A_l·sin(w_l·t + φ_l) + r_{0,l}    (1)

where A_l, w_l, φ_l and r_{0,l} are respectively the amplitude, angular velocity, initial phase and mean value of scattering point l.
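Purely as an illustration of what this separation step is expected to produce (one range trajectory per scattering point), the sketch below uses simple per-column peak detection with nearest-neighbour association. It is a simplified stand-in, not the empirical mode decomposition algorithm of step S13, and the function name and interface are assumptions.

import numpy as np
from scipy.signal import find_peaks

def separate_tracks(I, num_scatterers):
    """Crude stand-in for the EMD-based separation of step S13.

    I : (Kr, M) non-negative range/slow-time magnitude image.
    Returns (num_scatterers, M) range-bin indices, one roughly sinusoidal
    track per scattering point.
    """
    Kr, M = I.shape
    tracks = np.zeros((num_scatterers, M))
    # start each track at one of the strongest range bins of the first column
    tracks[:, 0] = np.argsort(I[:, 0])[::-1][:num_scatterers]
    for t in range(1, M):
        peaks, _ = find_peaks(I[:, t])
        if peaks.size == 0:                     # no detection: hold the previous bin
            tracks[:, t] = tracks[:, t - 1]
            continue
        for l in range(num_scatterers):
            # nearest-neighbour association with the previous position
            tracks[l, t] = peaks[np.argmin(np.abs(peaks - tracks[l, t - 1]))]
    return tracks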
Further, step S2 includes the following substeps:
S21, calculating, from the image data I_l(r, t) of scattering point l described by formula (1), its autocorrelation sequence [the autocorrelation expression is rendered as an image in the original], where CCR_l(m), m = 1, …, M, is the autocorrelation sequence of scattering point l and m denotes the sample delay;
S22, detecting the main peaks of the autocorrelation sequence CCR_l(m) and estimating the average sample interval P between them, so that the period of the sinusoidal signal in image I_l(r, t) is T_l = P×T_r and the precession angular frequency of scattering point l is w_l = 2π/T_l;
S23, repeating steps S21-S22 to estimate the periods of the sinusoidal signals of the remaining L-1 scattering points; the precession period T of the target is taken as the average of the periods T_l of all scattering-point sinusoidal signals, T = (1/L)·Σ_{l=1}^{L} T_l, and the angular velocity w of the target is taken as the average of the sinusoidal angular velocities w_l of all scattering points, w = (1/L)·Σ_{l=1}^{L} w_l.
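A minimal sketch of the period estimate of steps S21-S23 follows, assuming the separated signal of one scattering point is available as a one-dimensional range trajectory versus slow time; the autocorrelation form and peak-spacing details are assumptions about the formulas rendered as images above.

import numpy as np
from scipy.signal import find_peaks

def estimate_period(track, Tr):
    """Estimate the sinusoid period of one scatterer track (cf. steps S21-S22).

    track : 1-D array, range position of the scattering point versus slow time.
    Tr    : slow-time sampling interval (waveform repetition period), s.
    """
    x = track - track.mean()                            # remove the mean before correlating
    ccr = np.correlate(x, x, mode="full")[x.size - 1:]  # autocorrelation at delays >= 0
    peaks, _ = find_peaks(ccr, height=0.1 * ccr[0])     # keep only the dominant peaks (heuristic)
    if peaks.size < 2:
        raise ValueError("track too short: need at least two autocorrelation peaks")
    P = int(round(np.mean(np.diff(peaks))))             # average peak spacing in samples
    T = P * Tr                                          # sinusoid period T_l = P * T_r
    return P, T, 2 * np.pi / T                          # P, period, angular frequency w_l

# Step S23: the target-level precession period and angular velocity are the
# averages over all scatterer tracks, e.g.
#   T = np.mean([estimate_period(trk, Tr)[1] for trk in tracks])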
further, step S3 includes the following substeps:
S31, from image I_l(r, t), intercepting an image I'_l(r, t') of length equal to one precession period T, where t' = P_0, P_0+1, …, P_0+P-1, P is the number of samples corresponding to the precession period T, and P_0 is the starting sample position of image I'_l(r, t');
S32, estimating the mean r_{0,l} of the sinusoidal signal of scattering point l according to formula (4) [rendered as an image in the original];
S33, repeating steps S31-S32 to estimate the mean parameters of the sinusoidal signals of the remaining L-1 scattering points; r_{0,l} represents the average position of scattering point l in the image I_l(r, t).
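Under the same one-dimensional track representation, a minimal reading of steps S31-S32 is sketched below; because formula (4) is not reproduced in this text, averaging the track over exactly one precession period is an assumption.

import numpy as np

def estimate_mean(track, P, P0=0):
    """Estimate the sinusoid mean r0 of one scatterer (cf. steps S31-S32).

    Averaging a sinusoid over exactly one full period (P samples starting at
    sample P0) cancels the oscillating term and leaves only its mean.
    """
    return float(np.mean(track[P0:P0 + P]))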
Further, step S4 includes the following substeps:
S41, determining the length and step of the time sliding window: setting the sliding-window length to P and the window step to ΔP; for the image I_l(r, t) with time-sample length M, a total of K = floor((M-P)/ΔP) processing windows can be formed; letting the signal in the k-th window be I_{l,k}(r, t_k), where t_k = (k-1)·ΔP + j, j is the in-window sample number with j = 1, …, P, and k is the sliding-window number with k = 1, …, K;
S42, according to the image I_l(r, t), setting the search range of the amplitude parameter to [0, A_{0,l}] with search step ΔA, so that the amplitude values to be searched are A_u = u·ΔA, where u = 0, 1, …, U and U = A_{0,l}/ΔA; setting the search range of the initial-phase parameter to [0, 2π] with search step Δφ, so that the initial-phase values to be searched are φ_v = v·Δφ, where v = 0, 1, …, V and V = 2π/Δφ;
S43, setting up a search-result accumulator matrix [Q]_{uv} = q_{uv}, u = 0, 1, …, U, v = 0, 1, …, V, and initializing the matrix elements q_{uv} = 0;
S44, performing EHT processing on the data in the k-th sliding window: mapping the sample points of I_{l,k}(r, t_k) from the range-slow-time domain to the amplitude and initial-phase parameter space of the sinusoidal signal, the EHT-based mapping being

A_{k,j} = (r_{k,j} - r_{0,l}) / sin(w·t_k + φ_v)    (5)

where r_{k,j} is sample point j of the sinusoidal signal within window k, w is the angular velocity of the precessing target estimated in step S2, and r_{0,l} is the mean of the sinusoidal signal estimated in step S3;
S45, for each sample r_{k,j} of window k, using formula (5) to compute, for every searched initial-phase value φ_v, v = 1, …, V, the corresponding sinusoidal-signal amplitude A_{k,j}; determining which amplitude cell A_{k,j} falls in, i.e. A_u ≤ A_{k,j} < A_{u+1}; recording the corresponding amplitude A_u and initial-phase φ_v cell, and accumulating the matrix element q_{uv}: q_{uv} = q_{uv} + 1;
S46, performing peak detection on the accumulator matrix; the position (u_0, v_0) of the maximum peak max(q_{uv}) corresponds to the amplitude A_k = u_0·ΔA and initial phase φ_k = v_0·Δφ of the sinusoidal signal in window k;
S47, repeating steps S43-S46 to perform EHT processing on the data of the remaining K-1 windows, obtaining sliding-window estimation sequences of the amplitude and initial-phase values of scattering point l: {A_k, k = 1, …, K} and {φ_k, k = 1, …, K}; the amplitude parameter A_l and initial-phase parameter φ_l of the sinusoidal signal of scattering point l are the means of the respective sliding-window estimation sequences, A_l = (1/K)·Σ_{k=1}^{K} A_k and φ_l = (1/K)·Σ_{k=1}^{K} φ_k;
S48, repeating steps S41-S47 to estimate the amplitudes and initial-phase parameters of the sinusoidal signals of the remaining L-1 scattering points.
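The single-window voting of steps S43-S46 can be sketched as follows; the routine inverts the sinusoid model per formula (5) and accumulates votes over an amplitude/phase grid. The function interface and parameter names are assumptions, not part of the patent text.

import numpy as np

def eht_window(t_win, r_win, w, r0, A_max, dA, dphi):
    """One-window extended Hough transform vote (cf. steps S43-S46).

    t_win, r_win : slow-time instants and range samples of the sinusoid in this window.
    w, r0        : angular velocity (step S2) and mean (step S3) already estimated.
    Returns the (amplitude, initial phase) at the accumulator peak.
    """
    A_grid = np.arange(0.0, A_max + dA, dA)                 # amplitude search cells A_u
    phi_grid = np.arange(0.0, 2 * np.pi, dphi)              # initial-phase search cells phi_v
    Q = np.zeros((A_grid.size, phi_grid.size), dtype=int)   # accumulator matrix q_uv

    for tj, rj in zip(t_win, r_win):
        for v, phi in enumerate(phi_grid):
            s = np.sin(w * tj + phi)
            if abs(s) < 1e-6:                               # skip near-zero denominators
                continue
            A = (rj - r0) / s                               # formula (5): candidate amplitude
            u = int(A // dA)                                # amplitude cell index
            if 0 <= u < A_grid.size:                        # vote only inside the search range
                Q[u, v] += 1
    u0, v0 = np.unravel_index(np.argmax(Q), Q.shape)        # accumulator peak (u_0, v_0)
    return u0 * dA, v0 * dphi                               # A_k and phi_k for this window

The sliding-window procedure of steps S41 and S47 then partitions the track into K windows, calls this routine once per window, and averages the per-window estimates (A_k, φ_k).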
Further, step S5 is specifically: from the sinusoidal-signal expression for scattering point l, r_l(t) = A_l·sin(w·t + φ_l) + r_{0,l}, t = 1, …, M, the amplitude value at each of the M moments is a_l(t) = A_l and the phase value is w·t + φ_l.
Further, step S6 is specifically: mapping the estimated sinusoidal-signal parameters of scattering point l into a rectangular coordinate system, so that the spatial rectangular coordinates of the l-th scattering point at time t are given by formula (7) [rendered as an image in the original]; from the coordinates [x_l(t), y_l(t), z_l(t)] of all L scattering points at different times, three-dimensional images of the target at different moments can be reconstructed.
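Formula (7) itself is not reproduced in this text. Purely as an illustration, the sketch below uses one plausible mapping in which the amplitude sets the radius of the circle traced about the precession axis, the instantaneous phase w·t + φ_l sets the angular position, and the mean sets the coordinate along the axis; this mapping is an assumption, not the patent's formula (7).

import numpy as np

def reconstruct_positions(A, phi, r0, w, t):
    """Map estimated sinusoid parameters to 3-D points (illustrative only).

    A, phi, r0 : arrays of per-scatterer amplitude, initial phase and mean.
    w, t       : precession angular velocity and one slow-time instant.
    Assumed mapping (NOT the patent's formula (7)): each scatterer lies on a
    circle of radius A about the precession axis, at angle w*t + phi, at
    height r0 along that axis.
    """
    A, phi, r0 = map(np.asarray, (A, phi, r0))
    theta = w * t + phi
    return np.stack([A * np.cos(theta), A * np.sin(theta),
                     r0 * np.ones_like(theta)], axis=-1)   # shape (L, 3)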
The invention has the beneficial effects that:
(1) The invention separates the signal curves of all scattering points by an empirical mode decomposition method, avoiding mutual interference between scattering points when the sinusoid parameters are estimated.
(2) The invention can accurately estimate the period, mean, amplitude and initial-phase parameters of the sinusoidal signals.
(3) The invention uses sliding-window EHT processing to estimate the amplitude and initial-phase parameters of the sinusoidal signals; it needs little prior information, achieves high estimation precision, and avoids the requirement of existing generalized-Radon-transform-based imaging methods to construct an ideal reference sinusoid.
(4) Using the estimated period, mean, amplitude and initial-phase parameters, the invention obtains the sinusoidal-signal expression of each scattering point, calculates the sinusoidal-signal values at different moments, and reconstructs three-dimensional images of the precession target at different moments; the results can be used to analyze the shape, position, precession state and other information of the target.
Drawings
Fig. 1 is a diagram of a radar imaging scene with respect to a precession target according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of the positions of the target and the main scattering points according to an embodiment of the present invention.
Fig. 3 is a flowchart of a precession target radar three-dimensional imaging method based on a sliding window EHT provided by the present invention.
FIG. 4 is a range-slow time domain image of a precession target echo according to an embodiment of the present invention.
Fig. 5 is a 1 st scattering point image obtained after echo signal separation according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of the peak value and the position of the autocorrelation sequence according to an embodiment of the present invention.
Fig. 7 is a schematic diagram illustrating a window sliding method during sliding window processing according to an embodiment of the present invention.
FIG. 8 is a diagram illustrating the result of estimating the amplitude and initial phase parameters of the sinusoidal signal at the scattering point 2 according to the embodiment of the present invention.
Fig. 9 shows three-dimensional images of the reconstructed precession target at times t = 1, 61, 121 and 181, respectively, according to an embodiment of the present invention.
Detailed Description
The embodiments of the present invention will be further described with reference to the accompanying drawings.
The embodiment of the invention relates to a precession target. The target is a cone containing L = 5 main scattering points, whose coordinates in the local coordinate system of the target are P1(0 m, 0 m, 1 m), P2(0.5 m, 0 m, -0.5 m), P3(0 m, 0.5 m, -0.5 m), P4(-0.5 m, 0 m, -0.5 m) and P5(0 m, -0.5 m, -0.5 m); that is, scattering point P1 lies at the apex of the cone and P2, P3, P4 and P5 lie on the bottom surface of the cone, as shown in fig. 2. The precession state of the target is described in a reference coordinate system O-XYZ, in which the Z axis is the precession axis, the precession angle is θ, and the target rotates about the Z axis at an angular velocity of 4π rad/s. The range extent observed by the radar is set to R_r = 5 m, the range resolution is Δr = 0.05 m, and the number of range cells is K_r = R_r/Δr = 100.
The invention provides a precession target radar three-dimensional imaging method based on a sliding window EHT, which comprises the following steps as shown in figure 3:
And S1, separating from the target signal each scattering-point echo signal that varies as a sinusoidal function.
S11, the radar echo signal of the target is s(n, t) [rendered as an image in the original], where L = 5, the fast-time index is n = 1, …, N with N = 100, the slow-time index is t = 1, …, M with M = 1000, and the slow-time interval is T_r = 0.002 s.
S12, performing matched filtering to obtain the target signal S(r, t) in the range-slow-time domain [rendered as an image in the original], where the range-cell index is r = 1, …, K_r with K_r = 100 and the slow-time index is t = 1, …, M; the corresponding range-slow-time image is I(r, t), as shown in fig. 4.
S13, separating I(r, t) by the empirical mode decomposition method into images I_l(r, t), 1 ≤ l ≤ L, corresponding to the L = 5 scattering points; the image I_1(r, t) of scattering point l = 1 is shown in fig. 5.
And S2, estimating the period parameter of the sinusoidal signal to obtain the precession period of the precession target.
S21, calculating the autocorrelation sequence CCR_1(m), m = 1, …, M, of the image data I_1(r, t) of the 1st scattering point described by formula (1); the result is shown in fig. 6.
S22, detecting the peaks of the autocorrelation sequence, which lie at ξ = {202, 452, 702, 952}; the average interval corresponds to P = 250 samples, so the precession period of the 1st scattering point is T_1 = P × T_r = 0.5 s and its precession angular velocity is w_1 = 4π rad/s.
S23, repeating steps S21-S22 to estimate the periods of the sinusoidal signals of the other 4 scattering points; the precession period T of the target is taken as the average of the periods of all scattering-point sinusoidal signals, and the angular velocity w of the target as the average of the sinusoidal angular velocities of all scattering points.
and S3, estimating the mean parameter of the sinusoidal signal.
S31, from I_1(r, t), intercepting an image I'_1(r, t') of one precession-period length, where t' = P_0, P_0+1, …, P_0+P-1, taking P_0 = 0 and P = 250.
S32, estimating the mean r_{0,1} of the sinusoidal signal of the 1st scattering point according to formula (4) [rendered as an image in the original]; the estimated r_{0,1} lies in the 56th range cell.
S33, repeating steps S31-S32 to estimate the mean parameters of the sinusoidal signals of the other 4 scattering points: r_{0,2} = -1.45, r_{0,3} = -1.64, r_{0,4} = -1.63 and r_{0,5} = -1.44, lying in the 79th, 83rd, 83rd and 79th range cells, respectively.
And S4, carrying out sliding window EHT processing on each scattering point signal, and estimating the amplitude and initial phase parameters of the sinusoidal signal.
S41, determining the length and step of the time sliding window: the sliding-window length is set to P = 250 and the window step to ΔP = 60.
For the image I_1(r, t) with time-sample length M = 1000, a total of K = floor((M-P)/ΔP) = 12 processing windows can be formed. Let the signal in the k-th window of the 1st scattering point be I_{1,k}(r, t_k), where t_k = (k-1)·ΔP + j, j is the in-window sample number with j = 1, …, P, and k = 1, …, K. The sliding-window process is shown in fig. 7.
S42, according to the image I_1(r, t), setting the search range of the amplitude parameter to [0, 0.5] with search step ΔA = 0.01, so that the amplitude search values are A_u = 0.01·u, u = 0, 1, …, U with U = 50; setting the search range of the initial-phase parameter to [0, 2π] with search step Δφ, so that the initial-phase search values are φ_v = v·Δφ, v = 0, 1, …, 62.
S43, setting up a search-result accumulator matrix [Q]_{uv} = q_{uv}, u = 0, 1, …, 50, v = 0, 1, …, 62, and initializing the matrix elements q_{uv} = 0.
S44, performing EHT processing on the data in the 1st sliding window: mapping the samples of I_{1,1}(r, t_1) from the range-slow-time domain to the amplitude and initial-phase parameter space of the sinusoidal signal. The EHT-based mapping follows formula (5), where r_{1,j} is sample j of the sinusoidal signal in window 1, w is the angular velocity of the precessing target estimated in step S2, and r_{0,1} is the mean of the 1st scattering-point sinusoidal signal estimated in step S3.
S45, for each sample r_{1,j} of window 1, using formula (5) to compute, for every initial-phase value φ_v, v = 0, 1, …, 62, the corresponding sinusoidal-signal amplitude A_{1,j}; if 0 ≤ A_{1,j} < 0.5, recording the corresponding amplitude A_u and initial-phase φ_v cell and accumulating the matrix element q_{uv}: q_{uv} = q_{uv} + 1.
S46, performing peak detection on the accumulator matrix; the maximum peak max(q_{uv}) = q_{16,53} corresponds to the amplitude A_1 = 0.01 × 16 = 0.16 and initial phase φ_1 = 53·Δφ of the sinusoidal signal in the 1st window.
S47, repeating steps S43-S46 to perform EHT processing on each sliding-window data set, obtaining a group of amplitude and initial-phase value sequences for the scattering point: {A_k, k = 1, …, 12} and {φ_k, k = 1, …, 12}; the amplitude parameter and the initial-phase parameter of the sinusoidal signal of the 1st scattering point are taken as the means of the respective sequences [numerical values rendered as images in the original].
S48, repeating steps S41-S47 to estimate the amplitudes and initial-phase parameters of the sinusoidal signals of the other 4 scattering points; the results of the first 4 sliding-window EHT runs for the 2nd scattering point are shown in fig. 8.
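As a usage sketch tying the embodiment's window sizes (M = 1000, P = 250, ΔP = 60, K = 12, ΔA = 0.01, amplitude range [0, 0.5]) to the eht_window routine sketched earlier after the S4 substeps, the fragment below runs the sliding-window loop on a synthetic track. The phase step 2π/62 and the track parameters are assumptions, not the embodiment's data.

import numpy as np

# Sliding-window loop with the embodiment's window sizes; eht_window is the
# earlier sketch. The track below is synthetic and purely illustrative.
M, P, dP, Tr = 1000, 250, 60, 0.002
w, r0 = 4 * np.pi, -0.30                      # r0 value is illustrative
t_all = np.arange(1, M + 1) * Tr              # slow-time instants
track = r0 + 0.16 * np.sin(w * t_all + 1.0)   # synthetic sinusoidal track for the demo

A_est, phi_est = [], []
for k in range(12):                           # K = floor((M - P) / dP) = 12 windows
    sl = slice(k * dP, k * dP + P)            # in-window sample indices
    A_k, phi_k = eht_window(t_all[sl], track[sl], w, r0,
                            A_max=0.5, dA=0.01, dphi=2 * np.pi / 62)
    A_est.append(A_k)
    phi_est.append(phi_k)

# Step S47: average the per-window estimates (naive mean; adequate here since
# the recovered phases do not wrap around 2*pi).
A_1, phi_1 = float(np.mean(A_est)), float(np.mean(phi_est))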
And S5, calculating the amplitude and the phase of the sinusoidal signal of the scattering point at different moments.
From the sinusoidal-signal expression for scattering point l, r_l(t) = A_l·sin(w·t + φ_l) + r_{0,l}, taking t = 1, …, M, the amplitude value at each moment is a_l(t) = A_l and the phase value is w·t + φ_l.
And S6, mapping the sinusoidal signal parameters of each scattering point into the actual position space of the target through coordinate conversion, and reconstructing a three-dimensional image of the target.
The estimated sinusoidal-signal parameters of scattering point l are mapped into a rectangular coordinate system; the rectangular position coordinates of the l-th scattering point in space at time t are given by formula (7) [rendered as an image in the original]. From the coordinates [x_l(t), y_l(t), z_l(t)] of the 5 scattering points at different times, three-dimensional images of the target at different moments can be reconstructed; the imaging results at times t = 1, 61, 121 and 181 are shown in fig. 9. Fig. 9 reproduces the position and shape of the actual target, and the precession state of the target can be observed from the three-dimensional imaging sequence, verifying the effectiveness of the method of the invention.
It will be appreciated by those of ordinary skill in the art that the embodiments described herein are intended to help the reader understand the principles of the invention, and that the invention is not limited to the specifically recited embodiments and examples. Those skilled in the art can make various other specific changes and combinations based on the teachings of the present invention without departing from its spirit, and such changes and combinations remain within the scope of the invention.

Claims (7)

1. A precession target radar three-dimensional imaging method based on a sliding-window extended Hough transform, characterized by comprising the following steps:
S1, separating from the target signal the echo signals of the individual scattering points, each of which varies as a sinusoidal function;
S2, estimating the period parameter of the sinusoidal signals to obtain the precession period of the precession target;
S3, estimating the mean parameter of the sinusoidal signals;
S4, performing sliding-window extended Hough transform processing on each scattering-point signal, and estimating the amplitude and initial-phase parameters of the sinusoidal signal;
S5, calculating the amplitude and phase of the sinusoidal signal of each scattering point at different moments;
S6, mapping the sinusoidal-signal parameters of each scattering point into the actual position space of the target through coordinate conversion, and reconstructing a three-dimensional image of the target.
2. The method according to claim 1, wherein the step S1 includes the following substeps:
S11, assuming the radar echo signal of the precession target is s(n, t) [the full expression is rendered as an image in the original], where L denotes the number of dominant scattering points contained in the target, n denotes the fast-time sample index with sampling interval Δt_s, and N is the total number of fast-time samples; t denotes the slow-time sample index, the sampling interval being equal to the radar waveform repetition period T_r, and t = 1, …, M, where M is the total number of slow-time samples;
S12, performing matched filtering to obtain the range-slow-time domain signal of the target, S(r, t) [rendered as an image in the original], where r denotes the range-gate index, the range resolution is Δr, r = 1, …, K_r, and the actual range observation extent is R_r = K_r·Δr; t denotes the slow-time sample index; S(r, t) is the range-slow-time image I(r, t) of the target;
S13, separating the image I(r, t) by an empirical mode decomposition algorithm to obtain the range-slow-time image I_l(r, t) corresponding to each scattering point, where l = 1, …, L; each image I_l(r, t) contains one sinusoidal curve, indicating that the scattering-point signal varies sinusoidally in the range-slow-time domain, and the polar-coordinate expression of this curve is

r_l(t) = A_l·sin(w_l·t + φ_l) + r_{0,l}    (1)

where A_l, w_l, φ_l and r_{0,l} are respectively the amplitude, angular velocity, initial phase and mean value of scattering point l.
3. The method according to claim 2, wherein the step S2 includes the following substeps:
S21, calculating, from the image data I_l(r, t) of scattering point l described by formula (1), its autocorrelation sequence [the autocorrelation expression is rendered as an image in the original], where CCR_l(m), m = 1, …, M, is the autocorrelation sequence of scattering point l and m denotes the sample delay;
S22, detecting the main peaks of the autocorrelation sequence CCR_l(m) and estimating the average sample interval P between them, so that the period of the sinusoidal signal in image I_l(r, t) is T_l = P×T_r and the precession angular frequency of scattering point l is w_l = 2π/T_l;
S23, repeating steps S21-S22 to estimate the periods of the sinusoidal signals of the remaining L-1 scattering points; the precession period T of the target is taken as the average of the periods T_l of all scattering-point sinusoidal signals, T = (1/L)·Σ_{l=1}^{L} T_l, and the angular velocity w of the target is taken as the average of the sinusoidal angular velocities w_l of all scattering points, w = (1/L)·Σ_{l=1}^{L} w_l.
4. the method according to claim 3, wherein the step S3 includes the following substeps:
S31, from image I_l(r, t), intercepting an image I'_l(r, t') of length equal to one precession period T, where t' = P_0, P_0+1, …, P_0+P-1, P is the number of samples corresponding to the precession period T, and P_0 is the starting sample position of image I'_l(r, t');
S32, estimating the mean r_{0,l} of the sinusoidal signal of scattering point l according to formula (4) [rendered as an image in the original];
S33, repeating steps S31-S32 to estimate the mean parameters of the sinusoidal signals of the remaining L-1 scattering points; r_{0,l} represents the average position of scattering point l in the image I_l(r, t).
5. The method according to claim 4, wherein the step S4 includes the following substeps:
S41, determining the length and step of the time sliding window: setting the sliding-window length to P and the window step to ΔP; for the image I_l(r, t) with time-sample length M, a total of K = floor((M-P)/ΔP) processing windows can be formed; letting the signal in the k-th window be I_{l,k}(r, t_k), where t_k = (k-1)·ΔP + j, j is the in-window sample number with j = 1, …, P, and k is the sliding-window number with k = 1, …, K;
S42, according to the image I_l(r, t), setting the search range of the amplitude parameter to [0, A_{0,l}] with search step ΔA, so that the amplitude values to be searched are A_u = u·ΔA, where u = 0, 1, …, U and U = A_{0,l}/ΔA; setting the search range of the initial-phase parameter to [0, 2π] with search step Δφ, so that the initial-phase values to be searched are φ_v = v·Δφ, where v = 0, 1, …, V and V = 2π/Δφ;
S43, setting up a search-result accumulator matrix [Q]_{uv} = q_{uv}, u = 0, 1, …, U, v = 0, 1, …, V, and initializing the matrix elements q_{uv} = 0;
S44, performing extended Hough transform processing on the data in the k-th sliding window: mapping the sample points of I_{l,k}(r, t_k) from the range-slow-time domain to the amplitude and initial-phase parameter space of the sinusoidal signal, the mapping based on the extended Hough transform being

A_{k,j} = (r_{k,j} - r_{0,l}) / sin(w·t_k + φ_v)    (5)

where r_{k,j} is sample point j of the sinusoidal signal within window k, w is the angular velocity of the precessing target estimated in step S2, and r_{0,l} is the mean of the sinusoidal signal estimated in step S3;
S45, for each sample r_{k,j} of window k, using formula (5) to compute, for every searched initial-phase value φ_v, v = 1, …, V, the corresponding sinusoidal-signal amplitude A_{k,j}; determining which amplitude cell A_{k,j} falls in, i.e. A_u ≤ A_{k,j} < A_{u+1}; recording the corresponding amplitude A_u and initial-phase φ_v cell, and accumulating the matrix element q_{uv}: q_{uv} = q_{uv} + 1;
S46, performing peak detection on the accumulator matrix; the position (u_0, v_0) of the maximum peak max(q_{uv}) corresponds to the amplitude A_k = u_0·ΔA and initial phase φ_k = v_0·Δφ of the sinusoidal signal in window k;
S47, repeating steps S43-S46 to perform extended Hough transform processing on the data of the remaining K-1 windows, obtaining sliding-window estimation sequences of the amplitude and initial-phase values of scattering point l: {A_k, k = 1, …, K} and {φ_k, k = 1, …, K}; the amplitude parameter A_l and initial-phase parameter φ_l of the sinusoidal signal of scattering point l are the means of the respective sliding-window estimation sequences, A_l = (1/K)·Σ_{k=1}^{K} A_k and φ_l = (1/K)·Σ_{k=1}^{K} φ_k;
S48, repeating steps S41-S47 to estimate the amplitudes and initial-phase parameters of the sinusoidal signals of the remaining L-1 scattering points.
6. The method according to claim 5, wherein step S5 is specifically: from the sinusoidal-signal expression for scattering point l, r_l(t) = A_l·sin(w·t + φ_l) + r_{0,l}, t = 1, …, M, the amplitude value at each of the M moments is a_l(t) = A_l and the phase value is w·t + φ_l.
7. The method according to claim 6, wherein step S6 is specifically: mapping the estimated sinusoidal-signal parameters of scattering point l into a rectangular coordinate system, so that the spatial rectangular coordinates of the l-th scattering point at time t are given by formula (7) [rendered as an image in the original]; from the coordinates [x_l(t), y_l(t), z_l(t)] of all L scattering points at different times, three-dimensional images of the target at different moments can be reconstructed.
CN201611182469.7A 2016-12-20 2016-12-20 Precession target radar three-dimensional imaging method based on sliding window EHT Active CN106707276B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611182469.7A CN106707276B (en) 2016-12-20 2016-12-20 Precession target radar three-dimensional imaging method based on sliding window EHT

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611182469.7A CN106707276B (en) 2016-12-20 2016-12-20 Precession target radar three-dimensional imaging method based on sliding window EHT

Publications (2)

Publication Number Publication Date
CN106707276A CN106707276A (en) 2017-05-24
CN106707276B true CN106707276B (en) 2020-07-31

Family

ID=58939333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611182469.7A Active CN106707276B (en) 2016-12-20 2016-12-20 Precession target radar three-dimensional imaging method based on sliding window EHT

Country Status (1)

Country Link
CN (1) CN106707276B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113589281B (en) * 2020-04-30 2024-07-09 北京理工大学重庆创新中心 GEO SAR ship target imaging method based on micro Doppler analysis
CN114185047B (en) * 2021-12-09 2023-06-27 电子科技大学 Double-base SAR moving target refocusing method based on optimal polar coordinate transformation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10282231A (en) * 1997-04-02 1998-10-23 Nec Corp Radar system
CN104330784A (en) * 2014-11-19 2015-02-04 西安电子科技大学 Plane target classification method based on rotor wing physical parameter estimation
CN105674814A (en) * 2016-01-12 2016-06-15 中国人民解放军国防科学技术大学 Target micro-motion feature extraction method used for estimating space warhead procession period

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Micro-Doppler extraction method based on short-time iterative adaptive inverse Radon transform (基于短时迭代自适应-逆Radon变换的微多普勒提取方法); 赵彤璐 et al.; 《电子学报》; 2016-03-15; Vol. 44, No. 3; pp. 505-513 *
Radar imaging method for rotationally symmetric warheads based on precession (基于进动的旋转对称弹头雷达成像方法); 刘进 et al.; 《信号处理》; 2009-09-25; Vol. 25, No. 9; pp. 1333-1337 *

Also Published As

Publication number Publication date
CN106707276A (en) 2017-05-24

Similar Documents

Publication Publication Date Title
Ni et al. Visual tracking using neuromorphic asynchronous event-based cameras
CN106569194B (en) A kind of interference formula three-dimensional imaging of wideband radar space cone target and fine motion feature extracting method
KR20170015306A (en) Method of tracking shape in a scene observed by an asynchronous light sensor
CN102494675B (en) High-speed visual capturing method of moving target features
CN112130142B (en) Method and system for extracting micro Doppler features of complex moving target
CN109471096B (en) Multi-sensor target matching method and device and automobile
CN106772352B (en) It is a kind of that Weak target detecting method is extended based on the PD radar of Hough and particle filter
CN107516322B (en) Image object size and rotation estimation calculation method based on log polar space
CN110187337B (en) LS and NEU-ECEF space-time registration-based high maneuvering target tracking method and system
CN105447867B (en) Spatial target posture method of estimation based on ISAR images
CN106707276B (en) Precession target radar three-dimensional imaging method based on sliding window EHT
CN104880160A (en) Two-dimensional-laser real-time detection method of workpiece surface profile
CN111401180B (en) Neural network recognition model training method, device, server and storage medium
CN103927784B (en) A kind of active 3-D scanning method
JP6733060B2 (en) Inverse Synthetic Aperture Radar for Vehicle Radar Systems
CN107203271B (en) Double-hand recognition method based on multi-sensor fusion technology
Rana et al. Position and velocity estimations of 2D-moving object using Kalman filter: Literature review
Wang et al. Runway detection and tracking for unmanned aerial vehicle based on an improved canny edge detection algorithm
Li et al. Three-dimensional reconstruction using ISAR sequences
CN109031339A (en) A kind of three-dimensional point cloud motion compensation process
CN110850386A (en) Rotor wing type unmanned aerial vehicle deep learning identification method based on fractional order domain features
Ge et al. Tracking video target via particle filtering on manifold
Xiao et al. Multi-target ISAR imaging based on image segmentation and short-time Fourier transform
CN110161500B (en) Improved circular SAR three-dimensional imaging method based on Radon-Clean
Huang et al. A rd-t network for hand gesture recognition based on millimeter-wave sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant