CN109254271B - Static target suppression method for ground monitoring radar system - Google Patents


Info

Publication number
CN109254271B
Authority
CN
China
Prior art keywords
target
detection information
radar
target detection
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811001833.4A
Other languages
Chinese (zh)
Other versions
CN109254271A (en)
Inventor
杨琳
黄涛
吴俊�
陶少杰
任翔
陈龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Sun Create Electronic Co Ltd
Original Assignee
Anhui Sun Create Electronic Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Sun Create Electronic Co Ltd filed Critical Anhui Sun Create Electronic Co Ltd
Priority to CN201811001833.4A
Publication of CN109254271A
Application granted
Publication of CN109254271B
Active legal status
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415 Identification of targets based on measurements of movement associated with the target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 Systems of measurement based on relative movement of target
    • G01S13/52 Discriminating between fixed and moving objects or between objects moving at different speeds

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a static target suppression method for a ground surveillance radar system, which comprises the following steps: acquiring target detection information processed by the radar front end; performing inter-frame accumulation on the target detection information and forming a two-dimensional image from the accumulated information; extracting target edges from the two-dimensional image to obtain the extracted target detection information belonging to the same target; performing static-target-area judgment on the target detection information belonging to the same target and marking the range of the static target area; and suppressing the target detection information within the static target area to obtain target detection information with static targets suppressed. The method solves the problem that the ground surveillance radar system cannot accurately delimit the static target area when the detection scene and erection position change, improves the suppression speed of static targets and the detection precision of moving targets, and provides a reliable data basis for subsequent track processing.

Description

Static target suppression method for ground monitoring radar system
Technical Field
The invention relates to the technical field of real-time suppression of stationary targets of ground monitoring radar systems, in particular to a stationary target suppression method for a ground monitoring radar system.
Background
The ground monitoring radar system is an integrated system of radar and photoelectric images, monitors ground targets and low-altitude targets including unmanned aerial vehicles and helicopters in all weather, has the capability of target classification, carries out comprehensive judgment by combining target echo intensity, target movement speed and photoelectric video information, identifies targets as single person, multiple persons, single vehicles and airplanes, monitors and tracks the movement of the targets, and can master and display the position and movement track of the targets.
The static target suppression method is a method for suppressing a static target and providing a reliable data base for subsequent moving target track processing.
In the conventional method of suppressing static targets by manually delimiting a static target area, the user sets the static target area directly on the terminal display interface. However, because the ground surveillance radar is portable, the static target area must be set again on the terminal display interface whenever the erection site changes; even at the same erection site, the static target area may change over time as the radar detects in different scenes. The traditional method of manually delimiting the static target area therefore cannot adapt to changes of detection scene and erection site.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a static target suppression method for a ground surveillance radar system, which solves the problem that the ground surveillance radar system cannot accurately delimit the static target area when the detection scene and erection position change, improves the suppression speed of static targets, improves the detection precision of moving targets, reduces the number of static targets output, and provides a reliable data basis for subsequent track processing.
In order to achieve the purpose, the invention adopts the following technical scheme that:
a stationary target suppression method for a ground surveillance radar system, comprising the steps of:
s1, acquiring target detection information processed by the radar front end;
s2, performing inter-frame accumulation processing on the target detection information, and forming a two-dimensional image from the accumulated target detection information;
s3, performing target edge extraction on the two-dimensional image to obtain extracted target detection information belonging to the same target;
s4, carrying out static target area judgment processing on target detection information belonging to the same target, and marking a static target area range;
and S5, performing suppression processing on the target detection information in the static target area range to obtain the target detection information with suppressed static targets.
In step S1, the target detection information includes: the time stamp of the target, the amplitude value of the target, the distance library of the target and the azimuth of the target;
time stamp of the target: the time when the radar detected the target;
the distance library of the target: a distance library in which the distance of the target detected by the radar is located;
orientation of the target: azimuth angle of the target detected by the radar;
amplitude value of the target: the energy value of the target detected by the radar;
the distance library is a unit obtained by quantizing the detection distance range of the radar;
in the process of scanning one frame, the radar detects A azimuths, and the radar's detection distance range in each azimuth is quantized into B distance libraries, with 32 ≤ A ≤ 64 and 1024 ≤ B ≤ 2000; that is, each azimuth corresponds to B distance libraries, and each distance-library number corresponds to A azimuths.
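As a rough illustration of this quantization, the sketch below maps a raw detection to its azimuth and distance-library indices. The helper and its names are illustrative, not part of the patent; the values A = 64, B = 2000 and the 0° to 90°, 0 km to 30 km spans are taken from the embodiment described later.

```python
# Illustrative quantization of a raw detection into (azimuth, distance-library)
# indices. Parameter values follow the embodiment (64 azimuths over 0-90 deg,
# 2000 distance libraries over 0-30 km); names are assumptions.

A_AZIMUTHS = 64           # azimuth samples per frame (32 <= A <= 64)
B_RANGE_BINS = 2000       # distance libraries per azimuth (1024 <= B <= 2000)
AZ_SPAN_DEG = 90.0        # detected azimuth sector
MAX_RANGE_M = 30_000.0    # maximum detection distance

def quantize(azimuth_deg, range_m):
    """Map a detection (azimuth in degrees, range in metres) to bin indices."""
    az_idx = min(int(azimuth_deg / (AZ_SPAN_DEG / A_AZIMUTHS)), A_AZIMUTHS - 1)
    bin_idx = min(int(range_m / (MAX_RANGE_M / B_RANGE_BINS)), B_RANGE_BINS - 1)
    return az_idx, bin_idx
```

With these spans the azimuth step is about 1.4° and the distance step 15 m, matching the sampling intervals quoted in the embodiment.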
In step S2, the inter-frame accumulation process includes the steps of:
s21, taking A directions detected by the radar as an x axis of a two-dimensional image, taking B distance libraries obtained by quantifying the range of the radar detection distance as a y axis of the two-dimensional image to form a coordinate axis of the two-dimensional image, wherein the two-dimensional image comprises A multiplied by B pixel points;
s22, accumulating the target detection information of the previous n frames scanned by the radar, wherein n is more than or equal to 50 and less than or equal to 100;
s23, acquiring the target detection information of the (n+1)th frame scanned by the radar, deleting that of the 1st frame, and counting the n-1 frames of target detection information from the 2nd frame through the nth frame; for a given distance library of a given azimuth, averaging the amplitude values of the targets in those n-1 frames of target detection information, and taking the average as the pixel value of the corresponding pixel point on the two-dimensional image; by analogy, sequentially averaging the target amplitude values of the n-1 frames of target detection information in every distance library of every azimuth, sequentially obtaining the pixel value of every pixel point to form the two-dimensional image, and marking the two-dimensional image with an index;
if the radar does not detect a target in a certain distance library of a certain azimuth, the amplitude value of the target in the target detection information for that distance library of that azimuth is 0 dB;
the index mark is as follows: each pixel point on the two-dimensional image is marked with an index, and the marking content is the target detection information of the n-1 frame on a certain distance library of a certain direction corresponding to the pixel point on the two-dimensional image, including the amplitude value of the target in the target detection information of each frame in the n-1 frame and the frame number of the frame;
s24, acquiring target detection information of an n +2 th frame scanned by the radar, deleting the target detection information of a 2 nd frame scanned by the radar, counting the target detection information of an n-1 frame from the start of the target detection information of a 3 rd frame scanned by the radar to the end of the target detection information of the n +1 th frame scanned by the radar, sequentially re-averaging the amplitude values of the targets in the target detection information of the n-1 frame in each distance bank of each direction in the mode of the step S23, sequentially re-obtaining the pixel values of each pixel point on the two-dimensional image, updating the two-dimensional image, and marking an index on the two-dimensional image;
and S25, acquiring target detection information of the next frame scanned by the radar in the manner, obtaining the pixel value of each pixel point on the two-dimensional image again, updating the two-dimensional image, and marking an index on the updated two-dimensional image.
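The sliding-window accumulation of steps S22 to S25 can be sketched as follows. This is a minimal illustration on a tiny grid (A = 2, B = 3, n = 4; the patent uses 32 to 64 azimuths, 1024 to 2000 distance libraries and n between 50 and 100), assuming each frame arrives as a simple A × B array of amplitudes rather than the patent's actual data format.

```python
from collections import deque

# Minimal sketch of the sliding-window accumulation in steps S22-S25,
# shrunk to a tiny grid: A = 2 azimuths, B = 3 distance libraries, n = 4.
# Each frame is an A x B list of target amplitudes; 0 means no detection.

A, B, N = 2, 3, 4

def update_image(window, new_frame):
    """Slide the accumulation window by one frame and rebuild the image.

    `window` keeps the most recent n-1 frames; after each new frame the
    oldest one is dropped, and the pixel value at (azimuth a, library b)
    is the mean amplitude over the windowed frames, as in step S23.
    """
    window.append(new_frame)          # acquire the newest scanned frame
    if len(window) > N - 1:
        window.popleft()              # delete the oldest frame (S23/S24)
    return [[sum(f[a][b] for f in window) / len(window)
             for b in range(B)] for a in range(A)]
```

Keeping the raw per-frame amplitudes in the window also provides the per-pixel index the text describes: each frame's amplitude and frame number remain recoverable for any pixel.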
In step S3, the target edge extraction includes the following steps:
s31, performing binarization on the two-dimensional image to obtain a binary image; the specific method of binarization is as follows: sequentially judge whether the pixel value of each pixel point on the two-dimensional image is greater than or equal to 1 dB; if so, set the pixel value of that pixel point to 1; if not, set it to 0;
s32, sequentially carrying out edge detection on the pixel points on the binary image by using a Sobel edge detection operator to obtain edge pixel points;
the specific method of edge detection is as follows: for the pixel point Z5 to be detected, calculate its gradient g = |f * Gx| + |f * Gy|; if g is greater than or equal to a set threshold, the pixel point Z5 to be detected is an edge pixel point;
wherein the matrix f is formed by the pixel value of the pixel point Z5 to be detected and the pixel values of its 8 neighbourhood pixel points;
Gx and Gy are the two 3 × 3 matrices of the Sobel edge detection operator:
Gx = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]],  Gy = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]]
s33, the edge pixel points are the pixel points on the contour of a target; the pixel points inside the contour of the same target are obtained from the pixel points on that contour, i.e. the pixel points on and inside the contour of the same target, and hence the pixel points belonging to the same target, are obtained from the edge pixel points; the n-1 frames of target detection information belonging to the same target are then obtained from the n-1 frames of target detection information corresponding to each pixel point of that target on the two-dimensional image;
in step S32, if a pixel point to be detected does not have 8 neighbourhood pixel points, i.e. the matrix f cannot be formed, edge detection is not performed on it; edge detection is instead performed on the pixel points adjacent to or contiguous with it. If a pixel point to be detected has no adjacent or contiguous pixel points, i.e. it is an isolated pixel point, it is not a target pixel point.
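A compact sketch of steps S31 and S32, assuming the image is a plain list-of-lists of amplitudes; the function names are illustrative, and Gx and Gy are the standard Sobel kernels. Border pixels without a full 3 × 3 neighbourhood are skipped, as the method prescribes.

```python
# Sketch of binarization (S31) and the Sobel edge test (S32):
# g = |f * Gx| + |f * Gy| over each interior pixel's 3x3 neighbourhood,
# compared against a threshold (the embodiment uses 1).

GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # standard Sobel kernels
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def binarize(img, thresh=1.0):
    """Pixel value >= thresh dB becomes 1, otherwise 0 (step S31)."""
    return [[1 if v >= thresh else 0 for v in row] for row in img]

def sobel_g(binary, y, x):
    """Gradient g for an interior pixel of the binary image."""
    gx = sum(GX[i][j] * binary[y - 1 + i][x - 1 + j]
             for i in range(3) for j in range(3))
    gy = sum(GY[i][j] * binary[y - 1 + i][x - 1 + j]
             for i in range(3) for j in range(3))
    return abs(gx) + abs(gy)

def edge_pixels(binary, thresh=1):
    """All interior pixels whose g meets the threshold (step S32);
    border pixels lack the full matrix f and are not tested."""
    h, w = len(binary), len(binary[0])
    return [(y, x) for y in range(1, h - 1) for x in range(1, w - 1)
            if sobel_g(binary, y, x) >= thresh]
```

Binarizing first keeps the gradient values small and uniform, which is exactly why the text credits this step with fewer false or double edges.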
In step S4, the still target area determination process includes the steps of:
s41, comparing the azimuths of the target in the n-1 frames of target detection information belonging to the same target, finding the maximum target azimuth α2 and the minimum target azimuth α1, where α1 and α2 both lie within the radar's detection azimuth range; and comparing the distance libraries of the target in the n-1 frames of target detection information belonging to the same target, finding the maximum target distance library β2 and the minimum target distance library β1, where β1 and β2 denote distance-library numbers and 1 ≤ β1 ≤ β2 ≤ B; the coverage of the target's detection information in azimuth runs from the minimum target azimuth α1 to the maximum target azimuth α2, its coverage in distance library runs from the minimum target distance library β1 to the maximum target distance library β2, and the area of the coverage of the target's detection information is (α2-α1)×(β2-β1);
S42, from the number of target amplitude values that are not 0 dB in the n-1 frames of target detection information belonging to the same target and the area (α2-α1)×(β2-β1) of the coverage of the target's detection information, calculating the region quantization parameter of the target; if the region quantization parameter of the target is greater than a threshold, the target is a static target and the coverage of its detection information is a static target region;
the region quantization parameter of the target is calculated as:
region quantization parameter = i/((α2-α1)×(β2-β1));
where i denotes the number of target amplitude values that are not 0 dB in the n-1 frames of target detection information belonging to the same target.
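The decision in steps S41 and S42 reduces to a density test: the count of non-zero amplitude reports divided by the area of the bounding box of the target's detections, with dense returns indicating a static target. A minimal sketch follows, assuming each target's reports are given as (azimuth, distance-library number, amplitude in dB) tuples; the guard for a degenerate zero-area box is an addition not addressed in the source.

```python
# Sketch of the S41-S42 static-target decision: region quantization
# parameter = (non-zero amplitude reports) / bounding-box area.
# The record format and the zero-area guard are illustrative assumptions.

def is_stationary(detections, threshold):
    """detections: list of (azimuth, distance_library, amplitude_db)
    for one target, accumulated over the windowed frames."""
    azimuths = [d[0] for d in detections]
    bins = [d[1] for d in detections]
    a1, a2 = min(azimuths), max(azimuths)     # alpha1, alpha2
    b1, b2 = min(bins), max(bins)             # beta1, beta2
    area = (a2 - a1) * (b2 - b1)
    if area == 0:                             # degenerate box: guard added here
        return False, 0.0
    i = sum(1 for d in detections if d[2] != 0.0)
    q = i / area                              # region quantization parameter
    return q > threshold, q
```

A fixed return concentrated in a small box yields a large parameter; a moving target smears its reports over a wide box and stays below the threshold.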
In step S5, the suppression process includes the steps of:
s51, checking the target detection information processed by the radar front end against the static target region: if the azimuth of the target in that detection information lies within the azimuth coverage of a static target, and the distance library of the target lies within the distance-library coverage of that static target, the detection information is target detection information of a static target;
s52, in the data sent to subsequent track processing or to the terminal display interface, either mark the static target, i.e. add different flag bits to the target detection information of static targets and of non-static targets to distinguish them; or do not output the target detection information of static targets, i.e. do not send the target detection information of static targets and send only that of non-static targets.
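Both output options of step S52 can be sketched as a membership test against the marked regions. The region tuple layout (azimuth extent, distance-library extent) and the detection record format here are assumptions for illustration.

```python
# Sketch of S51-S52: test each fresh detection against the marked static
# regions, then either drop static detections or tag them with a flag bit.
# Region/record layouts are illustrative assumptions.

def in_region(det, region):
    """region = (a1, a2, b1, b2): azimuth and distance-library extents of a
    static target area; det = (azimuth, distance_library, amplitude)."""
    a1, a2, b1, b2 = region
    return a1 <= det[0] <= a2 and b1 <= det[1] <= b2

def suppress(detections, regions, drop=True):
    """drop=True: output only non-static detections (second S52 option).
    drop=False: keep everything but attach a static flag (first option)."""
    flagged = [(d, any(in_region(d, r) for r in regions)) for d in detections]
    if drop:
        return [d for d, is_static in flagged if not is_static]
    return flagged
```

The flag-bit variant preserves the full data stream for track processing, while the drop variant reduces the output load on the terminal display.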
The invention has the advantages that:
(1) The value range of the accumulated statistical frame number n is a trade-off: it ensures that an unmanned aerial vehicle or low-speed target hovering in place is not judged to be a static target, while still suppressing static targets quickly.
(2) The invention uses the accumulated n frames of target detection information to form a two-dimensional image, and meanwhile, the two-dimensional image is continuously updated, so that the interpretation precision of the static target area is improved, the risk of misinterpretation of the static target area caused by accumulation along with time and changes of a detection scene and a detection position is reduced, and a foundation is provided for the subsequent static target inhibition treatment.
(3) Index marks are made on the two-dimensional images, statistics of historical data is facilitated, and data access efficiency is improved.
(4) The static target area judgment processing of the invention effectively judges the static target and the non-static target.
(5) When the target edge is extracted, the two-dimensional image is subjected to binarization processing, so that the operation amount is reduced, the detection efficiency of edge pixel points is improved, false edges or double edges caused by non-uniform pixel values on the two-dimensional image are reduced, and the probability of target splitting caused by false detection of the target edge is reduced.
(6) The static targets are not output, the output number of the static targets is effectively reduced, the detection precision of the moving targets is improved, and simultaneously, the target detection information of the moving targets is displayed on a terminal display interface.
(7) And the static target is marked, so that a reliable data base is provided for subsequent track processing.
(8) The static target suppression processing of the invention improves the stability and reliability of the ground monitoring radar system and simultaneously improves the suppression efficiency of the static target.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 is a diagram illustrating the result of accumulating the target detection information for 100 frames according to the present invention.
FIG. 3 is a diagram of the results of the target edge extraction of the present invention.
FIG. 4 is a schematic view of a stationary target area of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, a stationary target suppression method for a ground surveillance radar system includes the following steps:
s1, acquiring target detection information processed by the radar front end;
s2, performing inter-frame accumulation processing on the target detection information, and forming a two-dimensional image from the accumulated target detection information;
s3, performing target edge extraction on the two-dimensional image to obtain extracted target detection information belonging to the same target;
s4, carrying out static target area judgment processing on target detection information belonging to the same target, and marking a static target area range;
and S5, performing suppression processing on the target detection information in the static target area range to obtain the target detection information with suppressed static targets.
In step S1, the target detection information includes: the time stamp of the target, the amplitude value of the target, the distance library of the target and the azimuth of the target;
time stamp of the target: the time when the radar detected the target;
the distance library of the target: a distance library in which the distance of the target detected by the radar is located;
orientation of the target: azimuth angle of the target detected by the radar;
amplitude value of the target: the energy value of the target detected by the radar;
the distance library is a unit obtained by quantizing the detection distance range of the radar;
in this embodiment, the radar's detection azimuth range is 0° to 90°, with due north at 0° and an azimuth sampling interval of 1.4°, i.e. the radar detects 64 azimuths; the radar's detection distance range in each azimuth is 0 km to 30 km, the distance being relative to the radar's position, and this range is quantized into 2000 distance libraries with a distance sampling interval of 15 m;
during one scan frame the radar thus detects 64 azimuths and quantizes its detection distance range into 2000 distance libraries, i.e. each azimuth corresponds to 2000 distance libraries and each distance-library number corresponds to 64 azimuths.
In step S2, the inter-frame accumulation process includes the steps of:
s21, taking 64 directions detected by the radar as an x axis of a two-dimensional image, taking 2000 distance libraries obtained by quantizing the range of the detection distance of the radar as a y axis of the two-dimensional image to form a coordinate axis of the two-dimensional image, wherein the two-dimensional image comprises 64 x 2000 pixel points;
s22, accumulating the target detection information of the first 100 frames of radar scanning;
s23, acquiring the target detection information of the 101st frame scanned by the radar, deleting that of the 1st frame, and counting the 99 frames of target detection information from the 2nd frame through the 100th frame; for a given distance library of a given azimuth, averaging the amplitude values of the targets in those 99 frames of target detection information, and taking the average as the pixel value of the corresponding pixel point on the two-dimensional image; by analogy, sequentially averaging the target amplitude values of the 99 frames of target detection information in every distance library of every azimuth, sequentially obtaining the pixel value of every pixel point to form the two-dimensional image, and marking the two-dimensional image with an index;
if the radar does not detect a target in a certain distance library of a certain azimuth, the amplitude value of the target in the target detection information for that distance library of that azimuth is 0 dB;
the index mark is as follows: each pixel point on the two-dimensional image is marked with an index, and the marked content is the target detection information of the 99 frames on a certain distance library of a certain direction corresponding to the pixel point on the two-dimensional image, and comprises the amplitude value of a target in the target detection information of each frame in the 99 frames and the frame number of the frame;
s24, acquiring the target detection information of the 102nd frame scanned by the radar, deleting that of the 2nd frame, and counting the 99 frames of target detection information from the 3rd frame through the 101st frame; re-averaging the target amplitude values of those 99 frames in every distance library of every azimuth in the manner of step S23, re-obtaining the pixel value of every pixel point on the two-dimensional image, updating the two-dimensional image, and marking the updated image with an index;
and S25, acquiring target detection information of the next frame scanned by the radar in the manner, obtaining the pixel value of each pixel point on the two-dimensional image again, updating the two-dimensional image, and marking an index on the updated two-dimensional image.
In step S3, the target edge extraction includes the following steps:
s31, performing binarization on the two-dimensional image to obtain a binary image; the specific method of binarization is as follows: sequentially judge whether the pixel value of each pixel point on the two-dimensional image is greater than or equal to 1 dB; if so, set the pixel value of that pixel point to 1; if not, set it to 0;
s32, sequentially carrying out edge detection on the pixel points on the binary image by using a Sobel edge detection operator to obtain edge pixel points;
the specific method of edge detection is as follows: for the pixel point Z5 to be detected, calculate its gradient g = |f * Gx| + |f * Gy|; if g is greater than or equal to the set threshold (in this embodiment the set threshold is 1), the pixel point Z5 to be detected is an edge pixel point;
wherein the matrix f is formed by the pixel value of the pixel point Z5 to be detected and the pixel values of its 8 neighbourhood pixel points;
Gx and Gy are the two 3 × 3 matrices of the Sobel edge detection operator:
Gx = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]],  Gy = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]]
s33, the edge pixel points are the pixel points on the contour of a target; the pixel points inside the contour of the same target are obtained from the pixel points on that contour, i.e. the pixel points on and inside the contour of the same target, and hence the pixel points belonging to the same target, are obtained from the edge pixel points; the 99 frames of target detection information belonging to the same target are then obtained from the 99 frames of target detection information corresponding to each pixel point of that target on the two-dimensional image;
depending on the nature of radar target detection, a target differs from clutter in that it occupies a number of consecutive distance libraries in range rather than an isolated single point. Therefore, in step S32, if a pixel point to be detected does not have 8 neighbourhood pixel points, i.e. the matrix f cannot be formed, edge detection is not performed on it; edge detection is instead performed on the pixel points adjacent to or contiguous with it. If a pixel point to be detected has no adjacent or contiguous pixel points, i.e. it is an isolated pixel point, it is not a target pixel point.
In step S4, the still target area determination process includes the steps of:
s41, comparing the azimuths of the target in the 99 frames of target detection information belonging to the same target, finding the maximum target azimuth α2 and the minimum target azimuth α1, with 0° ≤ α1 ≤ α2 ≤ 90°; and comparing the distance libraries of the target in the 99 frames of target detection information belonging to the same target, finding the maximum target distance library β2 and the minimum target distance library β1, where β1 and β2 denote distance-library numbers and 1 ≤ β1 ≤ β2 ≤ 2000; the coverage of the target's detection information in azimuth runs from the minimum target azimuth α1 to the maximum target azimuth α2, its coverage in distance library runs from the β1-th distance library to the β2-th distance library, and the area of the coverage of the target's detection information is (α2-α1)×(β2-β1);
S42, from the number of target amplitude values that are not 0 dB in the 99 frames of target detection information belonging to the same target and the area (α2-α1)×(β2-β1) of the coverage of the target's detection information, calculating the region quantization parameter of the target; if the region quantization parameter is greater than a threshold, the target is a static target and the coverage of its detection information is a static target region; in this embodiment the threshold value is 28;
the region quantization parameter of the target is calculated as:
region quantization parameter = i/((α2-α1)×(β2-β1));
where i denotes the number of target amplitude values that are not 0 dB in the 99 frames of target detection information belonging to the same target.
In step S5, the suppression process includes the steps of:
s51, checking the target detection information processed by the radar front end against the static target region: if the azimuth of the target in that detection information lies within the azimuth coverage of a static target, and the distance library of the target lies within the distance-library coverage of that static target, the detection information is target detection information of a static target;
S52, in the data sent to subsequent track processing or to the terminal display interface, the static targets are marked: different identification flags are added to the target detection information of static targets and of non-static targets to distinguish them, providing a reliable data basis for subsequent track processing. Alternatively, the target detection information of static targets is not output: it is not sent at all, and only the target detection information of non-static targets is sent, which effectively reduces the number of static targets output and improves the detection accuracy for moving targets, whose target detection information is displayed on the terminal display interface. In this embodiment, the static targets are marked in the data sent to subsequent track processing or to the terminal display interface.
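The two output variants of step S52 can be sketched as follows; the dictionary keys, the region tuple layout, and the `mark_only` switch are illustrative assumptions, not from the patent:

```python
def suppress_static_targets(detections, static_region, mark_only=True):
    """Step S5 sketch: a detection whose azimuth AND distance bin both fall
    inside the judged static target region is either flagged (marking
    variant) or dropped (suppression variant)."""
    a1, a2, b1, b2 = static_region  # azimuth and distance-bin extents
    out = []
    for det in detections:
        is_static = (a1 <= det["azimuth"] <= a2
                     and b1 <= det["distance_bin"] <= b2)
        if mark_only:
            out.append({**det, "static": is_static})  # add identification flag
        elif not is_static:
            out.append(det)  # forward only non-static detections
    return out

detections = [{"azimuth": 11, "distance_bin": 101},   # inside the region
              {"azimuth": 40, "distance_bin": 500}]   # well outside it
marked = suppress_static_targets(detections, static_region=(10, 12, 100, 102))
```

The marking variant preserves every detection for downstream track processing; the suppression variant trades that completeness for a smaller output stream.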
As shown in fig. 2 to 4, the test data are a segment of target detection information obtained in real time and processed by the front end of the ground surveillance radar; the x-axis of the plot spans 64 azimuths and the y-axis spans 30 extracted distance bins. The invention forms a two-dimensional image from the accumulated target detection information and continuously updates it, which improves the interpretation accuracy of the static target area and reduces the risk of misjudging the static target area as detections accumulate over time and the detection scene and position change. The static target area is judged with a regional quantization parameter, whose threshold is set according to the characteristics and power of the radar detection. As shown in fig. 4, the invention marks the stationary targets; target pixel points that belong to the same target but lie outside the static target area form moving target tracks, providing a reliable data basis for subsequent track processing. This static target suppression method improves the stability and reliability of the ground surveillance radar system and raises the efficiency of static target suppression.
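The sliding-window accumulation that produces and refreshes the two-dimensional image (steps S22 to S25 of claim 1) can be sketched with a fixed-length queue of frames; the class name and the toy sizes below are illustrative:

```python
from collections import deque

import numpy as np

A, B, N = 4, 6, 5  # azimuths, distance bins, window length n (toy sizes)

class AccumulatedImage:
    """Each frame is an A x B array of target amplitudes (0 where the radar
    detected nothing). When a new frame arrives the oldest is discarded and
    the image is the per-pixel mean of the remaining n-1 older frames."""
    def __init__(self):
        self.frames = deque(maxlen=N)  # oldest frame dropped automatically

    def add_frame(self, frame):
        self.frames.append(np.asarray(frame, dtype=float))

    def image(self):
        older = list(self.frames)[:-1]  # the n-1 frames before the newest
        return np.mean(older, axis=0)

acc = AccumulatedImage()
for k in range(1, N + 2):  # frames 1..n+1; the deque retains frames 2..n+1
    acc.add_frame(np.full((A, B), float(k)))
img = acc.image()  # per-pixel mean of frames 2..n = (2+3+4+5)/4 = 3.5
```

Because the window slides one frame at a time, the image tracks slow scene changes while still smoothing out single-frame noise, which is what makes the static target area judgement stable.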
The invention is not to be considered as limited to the specific embodiments shown and described, but is to be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
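The edge extraction of step S3 (binarization followed by Sobel detection on pixels with a full 8-neighborhood, per claim 2) can be sketched as follows. The standard Sobel kernels and the threshold value are assumptions, since the patent's kernel figure is not reproduced here:

```python
import numpy as np

# Standard Sobel kernels; the patent's Gx/Gy figure is assumed to match these.
GX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
GY = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]])

def sobel_edges(binary, threshold=2):
    """Steps S31-S32 sketch: 'binary' is the binarized image (0/1 values);
    a pixel with a full 3x3 neighborhood is an edge pixel when
    g = |f*Gx| + |f*Gy| reaches the edge estimation threshold.
    The threshold value here is illustrative."""
    rows, cols = binary.shape
    edges = np.zeros((rows, cols), dtype=bool)
    for r in range(1, rows - 1):          # border pixels lack 8 neighbours
        for c in range(1, cols - 1):
            f = binary[r - 1:r + 2, c - 1:c + 2]
            g = abs(np.sum(f * GX)) + abs(np.sum(f * GY))
            edges[r, c] = g >= threshold
    return edges

img = np.zeros((5, 5), dtype=int)
img[1:4, 1:4] = 1                         # a 3x3 block of target pixels
edge = sobel_edges(img)                   # perimeter of the block is marked
```

Interior pixels of a uniform block cancel under both kernels (g = 0), so only the contour survives; the pixels enclosed by that contour are then gathered as belonging to the same target, as step S33 describes.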

Claims (4)

1. A stationary target suppression method for a ground surveillance radar system, comprising the steps of:
S1, acquiring target detection information processed by the radar front end;
S2, performing inter-frame accumulation on the target detection information and forming a two-dimensional image from the accumulated target detection information;
S3, performing target edge extraction on the two-dimensional image to obtain the extracted target detection information belonging to the same target;
S4, performing stationary target region judgment on the target detection information belonging to the same target and marking the stationary target region range;
S5, suppressing the target detection information within the stationary target region range to obtain the target detection information after stationary target suppression;
in step S1, the target detection information includes: the time stamp of the target, the amplitude value of the target, the distance bin of the target, and the azimuth of the target;
time stamp of the target: the time at which the radar detected the target;
distance bin of the target: the distance bin in which the distance of the target detected by the radar falls;
azimuth of the target: the azimuth angle of the target detected by the radar;
amplitude value of the target: the energy value of the target detected by the radar;
a distance bin is a unit obtained by quantizing the radar's detection distance range;
during the scan of one frame, the radar detects A azimuths, and the radar's detection distance range in each azimuth is quantized into B distance bins, with 32 ≤ A ≤ 64 and 1024 ≤ B ≤ 2000; each azimuth corresponds to B distance bins, and each distance bin corresponds to A azimuths;
in step S2, the inter-frame accumulation comprises the following steps:
S21, taking the A azimuths detected by the radar as the x-axis of a two-dimensional image and the B distance bins obtained by quantizing the radar's detection distance range as the y-axis, forming the coordinate axes of the two-dimensional image, which comprises A × B pixel points;
S22, accumulating the target detection information of the first n frames scanned by the radar, with 50 ≤ n ≤ 100;
S23, acquiring the target detection information of the (n+1)-th frame scanned by the radar and deleting that of the 1st frame; counting the n-1 frames of target detection information from the 2nd frame through the n-th frame, averaging the amplitude values of the targets in these n-1 frames in a given distance bin of a given azimuth, and taking the average as the pixel value of the pixel point in the corresponding distance bin of the corresponding azimuth on the two-dimensional image; averaging the target amplitude values of the n-1 frames in each distance bin of each azimuth in turn, obtaining the pixel value of each pixel point in turn to form the two-dimensional image, and marking an index on the two-dimensional image;
if the radar detects no target in a given distance bin of a given azimuth, the amplitude value of the target in the target detection information for that distance bin and azimuth is 0 dB;
the index marking is: each pixel point on the two-dimensional image is marked with an index whose content is the n-1 frames of target detection information in the distance bin and azimuth corresponding to that pixel point, including the amplitude value of the target in each of the n-1 frames and the frame number of the corresponding frame;
S24, acquiring the target detection information of the (n+2)-th frame scanned by the radar and deleting that of the 2nd frame; counting the n-1 frames of target detection information from the 3rd frame through the (n+1)-th frame, re-averaging the target amplitude values of these n-1 frames in each distance bin of each azimuth in the manner of step S23, re-obtaining the pixel value of each pixel point in turn, updating the two-dimensional image, and marking an index on it;
S25, acquiring the target detection information of each subsequent frame scanned by the radar in the manner of step S24, obtaining the pixel values of the pixel points anew, updating the two-dimensional image, and marking an index on the updated two-dimensional image.
2. The stationary target suppression method for a ground surveillance radar system according to claim 1, wherein in step S3, the target edge extraction comprises the following steps:
S31, performing binarization on the two-dimensional image to obtain a binary image; the specific method of binarization is: judge in turn whether the pixel value of each pixel point on the two-dimensional image is greater than or equal to 1 dB; if so, set the pixel value of the corresponding pixel point to 1 dB; if not, set it to 0 dB;
S32, performing edge detection on the pixel points of the binary image in turn with the Sobel edge detection operator to obtain edge pixel points;
the specific method of edge detection is: compute g for the pixel point Z5 to be detected, g = |f ∗ Gx| + |f ∗ Gy|; if g is greater than or equal to the set edge estimation threshold, the pixel point Z5 to be detected is an edge pixel point;
where the matrix f is the 3 × 3 matrix formed by the pixel value of the pixel point Z5 to be detected and the pixel values of its 8 neighborhood pixel points;
Gx and Gy are the two 3 × 3 matrices of the Sobel edge detection operator:
Gx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], Gy = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]] (the standard Sobel kernels);
S33, the edge pixel points are the pixel points on the contour of a target; from the pixel points on the contour belonging to the same target, the pixel points inside the contour are obtained, i.e., from the edge pixel points the pixel points on and inside the contour of the same target are obtained, giving the pixel points belonging to the same target; then, from the n-1 frames of target detection information corresponding to each pixel point belonging to the same target on the two-dimensional image, the n-1 frames of target detection information belonging to the same target are obtained;
in step S32, if a pixel point to be detected does not have 8 neighborhood pixel points, i.e., the matrix f cannot be formed, edge detection is not performed on it; edge detection is instead performed on a pixel point adjacent to or contiguous with it. If a pixel point to be detected without 8 neighborhood pixel points has no adjacent or contiguous pixel points, i.e., it is an isolated pixel point, it is not a target pixel point.
3. The stationary target suppression method for a ground surveillance radar system according to claim 2, wherein in step S4, the stationary target area determination process includes:
S41, comparing the azimuths of the target in the n-1 frames of target detection information belonging to the same target to find the maximum azimuth α2 of the target and the minimum azimuth α1 of the target, α1 and α2 both lying within the radar's detection azimuth range; and comparing the distance bins of the target in the n-1 frames of target detection information belonging to the same target to find the maximum distance bin β2 of the target and the minimum distance bin β1 of the target, where β1 and β2 denote distance bin numbers and 1 ≤ β1 ≤ β2 ≤ B; the coverage of the target detection information of the target in azimuth runs from the minimum azimuth α1 of the target to the maximum azimuth α2, and its coverage over the distance bins runs from the minimum distance bin β1 of the target to the maximum distance bin β2; the area of the coverage region of the target detection information of the target is (α2 - α1) × (β2 - β1);
S42, calculating the regional quantization parameter of the target from the number of target amplitude values that are not 0 dB in the n-1 frames of target detection information belonging to the same target and the area (α2 - α1) × (β2 - β1) of the coverage region of the target detection information of the target; if the regional quantization parameter of the target is greater than the regional quantization parameter threshold, the target is a stationary target, and the coverage region of the target detection information of the target is a stationary target region;
the regional quantization parameter of the target is calculated as follows:
regional quantization parameter of the target = i / [(α2 - α1) × (β2 - β1)];
where i denotes the number of target amplitude values that are not 0 dB in the n-1 frames of target detection information belonging to the same target.
4. The stationary target suppression method for a ground surveillance radar system according to claim 3, wherein in step S5, the suppression comprises the following steps:
S51, judging the target detection information processed by the radar front end against the stationary target region: if the azimuth of the target in the target detection information processed by the radar front end lies within the azimuth coverage of the stationary target's detection information, and the distance bin of the target in that detection information lies within the distance-bin coverage of the stationary target's detection information, then that target detection information is target detection information of a stationary target;
S52, in the data sent to subsequent track processing or to the terminal display interface, the stationary targets are marked, i.e., different identification flags are added to the target detection information of stationary targets and of non-stationary targets to distinguish them; or the target detection information of stationary targets is not output, i.e., it is not sent at all and only the target detection information of non-stationary targets is sent.
CN201811001833.4A 2018-08-30 2018-08-30 Static target suppression method for ground monitoring radar system Active CN109254271B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811001833.4A CN109254271B (en) 2018-08-30 2018-08-30 Static target suppression method for ground monitoring radar system

Publications (2)

Publication Number Publication Date
CN109254271A CN109254271A (en) 2019-01-22
CN109254271B true CN109254271B (en) 2021-05-04





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant