CN109917361B - Three-dimensional unknown scene imaging method based on bistatic radar - Google Patents


Info

Publication number
CN109917361B
CN109917361B (application CN201910259240.6A)
Authority
CN
China
Prior art keywords
imaging
vector
scene
unknown
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910259240.6A
Other languages
Chinese (zh)
Other versions
CN109917361A (en
Inventor
杨晓波
陈家辉
崔国龙
师贞鹏
李虎泉
郭世盛
张扬
孔令讲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN201910259240.6A
Publication of CN109917361A
Application granted
Publication of CN109917361B

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Radar Systems Or Details Thereof (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention discloses a three-dimensional unknown-scene imaging method based on a bistatic radar; it relates to radar imaging technology, and in particular to three-dimensional unknown-scene imaging based on bistatic radar. The unknown region is first scanned at multiple positions and angles with the bistatic radar, so that the measured data contain the complete information of the scene. Because the scanning scheme records only the attenuation of the received signal, it is also suitable for narrow-band radar, which effectively reduces system cost. The TV-MF-ART sparse reconstruction algorithm proposed in this patent then applies algebraic iterative reconstruction, a positivity constraint, a total-variation-minimization constraint, and median filtering to the measurement vector, finally yielding a high-precision three-dimensional scene image. The invention therefore offers fast scanning, a small computational load, and high reconstruction precision.

Description

Three-dimensional unknown scene imaging method based on bistatic radar
Technical Field
The invention relates to radar imaging technology, in particular to three-dimensional unknown scene imaging technology based on bistatic radar.
Background
Unknown-scene imaging transmits an electromagnetic-wave signal of a specific frequency band through a transmitting antenna, receives the echo or transmission signal of the scene with a receiving antenna, and moves the transmit-receive antennas to complete multi-position, multi-view detection, thereby obtaining echoes of the whole scene and forming a panoramic image that contains all scene imagery. The technology can acquire a complete image of an unknown scene, provides prior information for accurate target positioning and multipath suppression, and plays an important role in fields such as urban sensing and disaster relief.
Conventional unknown-scene imaging techniques typically acquire echo data with MIMO radar or SAR. However, to obtain accurate position information of scene objects, these detection approaches require accurate phase information and a large signal bandwidth, which leads to high system complexity and hardware cost. In addition, these approaches rely on echo signals; when the scene environment is complex, internal multipath signals dominate and degrade the imaging quality. A new imaging scheme is therefore urgently needed.
In recent years, many domestic and foreign research institutions have tried to apply the theory of X-ray-based medical computed tomography (CT) to the microwave band, and related theoretical and technical work has been carried out. Karanam et al. realized three-dimensional unknown-scene imaging from Wi-Fi received-signal-strength values (RSSI) with the TVAL3 algorithm, based on the principle that electromagnetic waves attenuate differently in different media and at different thicknesses (C. R. Karanam and Y. Mostofi, "3D Through-Wall Imaging with Unmanned Aerial Vehicles Using Wi-Fi," 2017 16th ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN), April 2017, pp. 131-142). Fhager et al. simulated and verified microwave time-delay-spectrum measurement with linear-frequency-modulated signals using the FDTD method, and reconstructed a two-dimensional unknown object in a transmit-receive split synchronous scanning mode (Fhager A., Persson M., "Comparison of two image reconstruction algorithms for microwave tomography," Radio Science, 2005, 40(3): 1-15). Although these schemes can image unknown regions and objects, the transmit-receive split synchronous scanning scheme is inefficient, which is prohibitive for imaging large scenes, and the imaging quality is coarse. How to improve the efficiency with which radar acquires echoes of the detection region, and how to obtain high-quality unknown-scene images, therefore still require substantial work.
Disclosure of Invention
Aiming at the shortcomings of existing unknown-scene imaging technology, such as the low rate of transmit-receive synchronous scanning and poor imaging results, the invention provides a new unknown-scene imaging scheme based on a narrow-band, narrow-beam bistatic radar, which can effectively improve the efficiency of "perceiving" the interior of a region while obtaining high-quality scene images. First, a ground-air joint scanning mode based on a ground radar and an unmanned aerial vehicle is designed to acquire data of the unknown scene; this mode has high execution efficiency and strong environmental adaptability. Then, the connection between the received signal and the unknown scene is established with the WKB approximation. Finally, inversion with the median-filtered total-variation-minimization algebraic iterative algorithm (TV-MF-ART) proposed by the invention yields a high-precision scene imaging result.
The technical scheme of the invention is that the three-dimensional unknown scene imaging method based on the bistatic radar comprises the following steps:
step 1: acquiring scene information
Scan the unknown region as follows: a narrow-beam radar outside the unknown scene transmits a narrow-band continuous-wave signal, while on the other side of the scene an unmanned aerial vehicle carrying N_a receiving array elements moves along a prescribed route and receives the signal transmitted through the unknown region. During the movement of the unmanned aerial vehicle, the transmitting antenna is kept facing the corresponding receiving array element within each measurement interval, which guarantees that the attenuation of the received signal is caused by the direct wave penetrating the scene.
Let n_p denote the position of the unmanned aerial vehicle and n_v the viewing angle; then n = (n_a, n_p, n_v) identifies the n_a-th receiving array element at the n_p-th position under the n_v-th viewing angle. When N = N_a × N_p × N_v measurements are completed, where N_a is the total number of receiving array elements, N_p the number of unmanned-aerial-vehicle positions at a fixed viewing angle, and N_v the number of viewing angles, the measurement vector is expressed as:
P = [p_1, p_2, ..., p_N]^T (1)
where p_n is the received power at position n.
The measured value is the attenuation of the signal power, which differs with the size, position, and dielectric constant of the media on the path. According to the Wentzel-Kramers-Brillouin approximation, the relation between the measured attenuation and the propagation path can be expressed as:
σ_n ∝ exp(j2πf_c ∫_{T→R} α(r) dr) (2)
where f_c is the center frequency, σ_n the attenuation value, α(r) the attenuation rate of the electromagnetic wave at position r, and ∫_{T→R} the line integral along the path between the transmitting and receiving array elements. Discretize the imaging region into M cells and record the imaging vector O; the value of each cell depends on the attenuation rate of the electromagnetic wave at that location, so the imaging vector is the set of attenuation rates O = [α(r_1), α(r_2), ..., α(r_M)]^T. The link between the measurement vector and the imaging vector is therefore:
P = A·O + b (3)
where A ∈ R^(N×M) is the mapping matrix between the imaging vector and the measurement vector: A(i, j) = 1 when the j-th cell lies on the path of the i-th measurement, and 0 otherwise; b is the measurement error, including positioning error and environmental noise. This completes step 1, the acquisition of unknown-scene information.
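The discretized measurement model of equation (3) can be illustrated with a minimal numerical sketch in Python. The grid size, the path endpoints, and the dense point-sampling used to mark traversed cells are illustrative assumptions, not details from the patent:

```python
import numpy as np

def path_cells(tx, rx, grid_shape, cell=1.0, n_samples=400):
    """Crudely mark the cells crossed by the straight TX->RX path by dense
    point sampling (a stand-in for exact ray-cell intersection), matching
    the binary weights A(i, j) = 1 of Eq. (3)."""
    tx, rx = np.asarray(tx, float), np.asarray(rx, float)
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    pts = tx + t * (rx - tx)                       # points along the path
    idx = np.floor(pts / cell).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
    flat = {np.ravel_multi_index(tuple(i), grid_shape) for i in idx[inside]}
    return sorted(flat)

grid = (10, 10, 5)                                 # toy stand-in volume
M = int(np.prod(grid))
paths = [((0.0, y, 2.5), (9.99, y, 2.5)) for y in np.linspace(0.5, 9.5, 8)]
A = np.zeros((len(paths), M))
for i, (tx, rx) in enumerate(paths):
    A[i, path_cells(tx, rx, grid)] = 1.0           # j-th cell on i-th path

O = np.zeros(M)                                    # attenuation rate per cell
O[np.ravel_multi_index((5, 5, 2), grid)] = 0.8     # one attenuating cell
b = 0.01 * np.random.default_rng(0).standard_normal(len(paths))
P = A @ O + b                                      # measurement vector, Eq. (3)
```

In a real implementation an exact ray-cell traversal (e.g. a Siddon-style algorithm) would replace the point sampling.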
step 2: imaging unknown scenes
Executing step 1 yields a measurement vector containing the unknown-scene information; because the measured data are far fewer than the imaging cells, the scene is imaged from this measurement vector under sparse sampling;
step 2-1: algebraic iterative reconstruction
Regard the underdetermined system of equations (3) as N hyperplanes; first set an initial solution, then orthogonally project it onto each hyperplane in turn, iteratively updating the initial solution toward the true solution; the iterative equation is:
O^(q+1) = O^(q) + λ·(p_q − A_{q,+}·O^(q))/‖A_{q,+}‖₂² · A_{q,+}^T, q = 1, 2, ..., N_ART (4)
where λ is a convergence factor, which can be set to 1, O^(q) is the imaging vector at the q-th iteration, A_{q,+} is the q-th row vector of the mapping matrix, and N_ART is the number of measurement points; with the initial imaging vector set to 0, the iterations of this step yield an initial imaging vector;
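The projection update of equation (4) is the classical Kaczmarz sweep. A minimal self-contained sketch on a toy random system (matrix sizes, seed, and iteration count are illustrative assumptions):

```python
import numpy as np

def art_reconstruct(A, P, n_iter, lam=1.0):
    """Algebraic reconstruction technique, Eq. (4): starting from O = 0,
    orthogonally project the estimate onto one measurement hyperplane
    A[q] . O = P[q] per iteration, cycling through the rows."""
    O = np.zeros(A.shape[1])
    row_sq = np.einsum("ij,ij->i", A, A)        # ||A_{q,+}||_2^2 per row
    for it in range(n_iter):
        q = it % len(P)                         # cycle through N_ART rows
        if row_sq[q] == 0.0:
            continue
        O += lam * (P[q] - A[q] @ O) / row_sq[q] * A[q]
    return O

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
O_true = rng.random(10)
P = A @ O_true                                  # consistent, noise-free system
O_hat = art_reconstruct(A, P, n_iter=5000)
```

On a consistent system the cyclic sweeps converge to the true solution; with the noisy, underdetermined system of the patent they only approach it, which is why the later constraint steps are needed.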
step 2-2: performing positive constraint on the initial imaging vector obtained in the step 2-1;
setting the negative number in the initial imaging vector of the step 2-1 to 0; the expression is:
O=max(O,0) (5)
step 2-3: total variation minimization step
Perform total-variation-minimization iterations on the imaging matrix constrained in step 2-2; the iteration equation is:
O^(t+1) = O^(t) − α‖ΔO‖₂ · (∂‖O‖_TV/∂O)/‖∂‖O‖_TV/∂O‖₂, t = 1, 2, ..., N_TV (6)
where α is a manually set convergence factor, N_TV is the number of iteration rounds, ΔO is the difference between the imaging matrices of step 2-1 and step 2-2, ‖·‖₂ is the matrix two-norm, and ‖·‖_TV is the total variation (gradient) of the image; for a three-dimensional image the total variation is computed as:
‖O‖_TV = Σ_{i,j,k} √((O_{i,j,k} − O_{i−1,j,k})² + (O_{i,j,k} − O_{i,j−1,k})² + (O_{i,j,k} − O_{i,j,k−1})² + ρ) (7)
where O_{i,j,k} is the three-dimensional imaging matrix, i, j and k are the row, column and height indices, and ρ is a small positive number that prevents the denominator from being 0.
step 2-4: three-dimensional median filtering step
Convert the total-variation-minimized imaging vector into a three-dimensional matrix and perform a three-dimensional median-filtering operation; the expression is:
O(i, j, k) = median{O(i′, j′, k′) : (i′, j′, k′) ∈ W(i, j, k)} (8)
where the window W slides over the volume in the manner of a cyclic convolution and the window size depends on the number of imaging-matrix elements; the final scene imaging result is obtained after this step.
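Equation (8) can be sketched as a brute-force sliding-window median; the wrap-around (cyclic) padding and the window size w = 3 below are illustrative assumptions standing in for the cyclic window sliding described in the text:

```python
import numpy as np

def median_filter_3d(O, w=3):
    """3-D sliding-window median filter, Eq. (8); wrap padding emulates
    cyclic sliding of the window over the volume."""
    r = w // 2
    pad = np.pad(O, r, mode="wrap")
    out = np.empty_like(O)
    for i in range(O.shape[0]):
        for j in range(O.shape[1]):
            for k in range(O.shape[2]):
                out[i, j, k] = np.median(pad[i:i + w, j:j + w, k:k + w])
    return out

# a single-voxel outlier is removed while the background is preserved
V = np.zeros((6, 6, 6))
V[3, 3, 3] = 1.0
V_f = median_filter_3d(V)
```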
The invention provides a scene imaging scheme suitable for transmit-receive split radar, with fast scanning and high imaging precision. The unknown region is first scanned at multiple positions and angles with the bistatic radar, so that the measured data contain the complete information of the scene. Because the scanning scheme records only the attenuation of the received signal, it is also suitable for narrow-band radar, which effectively reduces system cost. The TV-MF-ART sparse reconstruction algorithm proposed in this patent then applies algebraic iterative reconstruction, a positivity constraint, a total-variation-minimization constraint, and median filtering to the measurement vector, finally yielding a high-precision three-dimensional scene image. The invention therefore offers fast scanning, a small computational load, and high reconstruction precision.
Drawings
FIG. 1 is a schematic diagram of a bistatic radar scan of the present invention;
FIG. 2 is a block diagram of a reconstruction algorithm;
FIG. 3 is a schematic representation of three-dimensional median filtering;
FIG. 4 is a scene to be reconstructed;
FIG. 5 is an example of a bistatic radar scanning scenario;
FIG. 6 is a graph showing the result of a conventional ART algorithm reconstruction;
FIG. 7 is a graph showing the result of the reconstruction algorithm proposed by the present patent;
FIG. 8 is a comparison of measurement curves under the Radon transform.
Detailed Description
The following describes the steps of the invention in connection with a simulation.
The building is shown in FIG. 4; the unknown scene region is 3 m × 3 m × 1 m. The region is divided into grid cells of 0.01 m × 0.01 m × 0.01 m, so the unknown region contains 9 × 10^6 cells in total. A ground-based radar and an unmanned aerial vehicle outside the region scan it: the ground-based radar transmits a 2 GHz sine-wave signal, and the unmanned aerial vehicle carries a linear array of 100 equally spaced elements that receives the signal transmitted through the scene and records the power-attenuation value.
Step 1: Move the unmanned aerial vehicle on the other side of the scene; each receiving array element records a signal-attenuation value once every 0.01 m of movement, for 300 recordings, after which one viewing-angle measurement is complete. The values recorded in this step include the attenuation of the direct path, the multipath components arising when penetrating walls, and environmental noise. To suppress the influence of multipath on the model, the narrow-beam antenna adopted by the method ensures that the direct wave dominates the received signal while the transmitting power is increased.
Step 2: To keep the acquired scene information non-redundant, the angle subtended during measurement must be sufficiently large, so measurements are completed at multiple viewing angles; the choice of viewing angles affects the imaging result. The simulation uses 4 viewing angles: 0°, 45°, 90° and 135°. After this round of measurements, 1200 measurement points in total are obtained and denoted P. The position of each measurement point is recorded and used to compute the mapping matrix A, of dimension 1200 × 9000000.
Step 3: Perform the algebraic iterative reconstruction step with the initial imaging vector set to all 0 and 1200 iterations; iterating according to equation (4) gives the imaging matrix O_ART.
Step 4: Apply the positivity constraint (5) to obtain O_POS, and compute the difference ΔO between the imaging matrices of step 3 and step 4.
Step 5: Apply the total-variation constraint according to equations (6) and (7) with 20 iterations, reducing the total variation of the image by gradient descent; the result is denoted O_MF.
Step 6: Execute the median-filtering operation of equation (8) to further smooth the image.
Step 7: If the image has not converged (convergence is declared when the total value of ΔO falls below a set threshold), continue with steps 3-6; after convergence, the final unknown-scene image is obtained.
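Steps 3-7 can be combined into one self-contained loop. All parameter values, the sparse binary toy system, and the simplified helper routines below are illustrative assumptions rather than the patent's exact configuration:

```python
import numpy as np

def _tv_grad(V, rho=1e-8):
    # gradient of the smoothed 3-D total variation of Eq. (7)
    dx = np.diff(V, axis=0, prepend=V[:1])
    dy = np.diff(V, axis=1, prepend=V[:, :1])
    dz = np.diff(V, axis=2, prepend=V[:, :, :1])
    den = np.sqrt(dx**2 + dy**2 + dz**2 + rho)
    gx, gy, gz = dx / den, dy / den, dz / den
    g = gx + gy + gz
    g[:-1] -= gx[1:]
    g[:, :-1] -= gy[:, 1:]
    g[:, :, :-1] -= gz[:, :, 1:]
    return g

def _median3(V, w=3):
    # brute-force 3-D median filter with wrap padding, Eq. (8)
    r = w // 2
    pad = np.pad(V, r, mode="wrap")
    out = np.empty_like(V)
    for i in range(V.shape[0]):
        for j in range(V.shape[1]):
            for k in range(V.shape[2]):
                out[i, j, k] = np.median(pad[i:i + w, j:j + w, k:k + w])
    return out

def tv_mf_art(A, P, shape, n_outer=5, n_tv=10, alpha=0.2, lam=1.0, tol=1e-6):
    """TV-MF-ART loop of steps 3-7: ART sweep, positivity clamp, TV
    gradient descent scaled by ||dO||_2, 3-D median filter, repeated
    until the step-3/step-4 difference dO falls below tol."""
    O = np.zeros(A.shape[1])
    row_sq = np.einsum("ij,ij->i", A, A)
    V = O.reshape(shape)
    for _ in range(n_outer):
        for q in range(len(P)):                 # step 3: ART sweep, Eq. (4)
            if row_sq[q] > 0:
                O = O + lam * (P[q] - A[q] @ O) / row_sq[q] * A[q]
        O_art = O.copy()
        O = np.maximum(O, 0.0)                  # step 4: positivity, Eq. (5)
        dO = np.linalg.norm(O_art - O)          # difference of steps 3 and 4
        V = O.reshape(shape).copy()
        for _ in range(n_tv):                   # step 5: TV descent, Eq. (6)
            g = _tv_grad(V)
            V = V - alpha * dO * g / (np.linalg.norm(g) + 1e-12)
        V = _median3(V)                         # step 6: median filter
        O = V.ravel()
        if dO < tol:                            # step 7: convergence check
            break
    return V

rng = np.random.default_rng(3)
shape = (6, 6, 4)
M = int(np.prod(shape))
A = (rng.random((60, M)) < 0.05).astype(float)  # sparse binary toy paths
O_true = np.zeros(M)
O_true[rng.choice(M, 8, replace=False)] = 1.0
P = A @ O_true
V_hat = tv_mf_art(A, P, shape)
```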
Those of ordinary skill in the art will recognize that the embodiments described here are intended to help the reader understand the principles of the invention, and it should be understood that the scope of the invention is not limited to these specific statements and embodiments. Various modifications and variations will be apparent to those skilled in the art; any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall be included in the scope of the claims of the present invention.

Claims (1)

1. A three-dimensional unknown scene imaging method based on a bistatic radar, the method comprising:
step 1: acquiring scene information
scanning the unknown region: a narrow-beam radar outside the unknown scene transmits a narrow-band continuous-wave signal, while on the other side of the scene an unmanned aerial vehicle carrying N_a receiving array elements moves along a prescribed route and receives the signal transmitted through the unknown region; during the movement of the unmanned aerial vehicle, the transmitting antenna is kept facing the corresponding receiving array element within the measurement time, so that the attenuation of the received signal is caused by the direct wave penetrating the scene;
let n_p denote the position of the unmanned aerial vehicle and n_v the viewing angle; then n = (n_a, n_p, n_v) identifies the n_a-th receiving array element at the n_p-th position under the n_v-th viewing angle; when N = N_a × N_p × N_v measurements are completed, where N_a is the total number of receiving array elements, N_p the number of unmanned-aerial-vehicle positions at a fixed viewing angle, and N_v the number of viewing angles, the measurement vector is expressed as:
P = [p_1, p_2, ..., p_n, ..., p_N]^T (1)
where p_n is the received power at position n;
the measured value is the attenuation of the signal power, which varies with the size, position and dielectric constant of the different media; according to the Wentzel-Kramers-Brillouin approximation, the relation between the measured attenuation and the propagation path is expressed as:
σ_n ∝ exp(j2πf_c ∫_{T→R} α(r) dr) (2)
where f_c is the center frequency, σ_n the attenuation value, α(r) the attenuation rate of the electromagnetic wave at position r, and ∫_{T→R} the line integral between the transmitting and receiving array elements; the imaging region is discretized into M cells and the imaging vector O is recorded; the value of each cell depends on the attenuation rate of the electromagnetic wave at that location, so the imaging vector is the set of attenuation rates O = [α(r_1), α(r_2), ..., α(r_M)]^T; the link between the measurement vector and the imaging vector is therefore:
P = A·O + b (3)
where A ∈ R^(N×M) is the mapping matrix representing the mapping between the imaging vector and the measurement vector, with A(i, j) = 1 when the j-th cell lies on the path of the i-th measurement and 0 otherwise; b is the measurement error, including positioning error and environmental noise; the acquisition of unknown-scene information in step 1 is thus completed;
step 2: imaging unknown scenes
executing step 1 yields a measurement vector containing the unknown-scene information; because the measured data are far fewer than the imaging cells, the scene is imaged from this measurement vector under sparse sampling;
step 2-1: algebraic iterative reconstruction
regard the underdetermined system of equations (3) as N hyperplanes; first set an initial solution, then orthogonally project it onto each hyperplane in turn, iteratively updating the initial solution toward the true solution; the iterative equation is:
O^(q+1) = O^(q) + λ·(p_q − A_{q,+}·O^(q))/‖A_{q,+}‖₂² · A_{q,+}^T, q = 1, 2, ..., N_ART (4)
where λ is a convergence factor, O^(q) is the imaging vector at the q-th iteration, A_{q,+} is the q-th row vector of the mapping matrix, and N_ART is the number of measurement points; with the initial imaging vector set to 0, the iterations of this step yield an initial imaging vector;
step 2-2: apply a positivity constraint to the initial imaging vector obtained in step 2-1;
set the negative entries of the initial imaging vector of step 2-1 to 0; the expression is:
O = max(O, 0) (5)
step 2-3: total-variation-minimization step
perform total-variation-minimization iterations on the imaging matrix constrained in step 2-2; the iteration equation is:
O^(t+1) = O^(t) − α‖ΔO‖₂ · (∂‖O‖_TV/∂O)/‖∂‖O‖_TV/∂O‖₂, t = 1, 2, ..., N_TV (6)
where α is a manually set convergence factor, N_TV is the number of iteration rounds, ΔO is the difference between the imaging matrices of step 2-1 and step 2-2, ‖·‖₂ is the matrix two-norm, and ‖·‖_TV is the total variation (gradient) of the image; for a three-dimensional image the total variation is computed as:
‖O‖_TV = Σ_{i,j,k} √((O_{i,j,k} − O_{i−1,j,k})² + (O_{i,j,k} − O_{i,j−1,k})² + (O_{i,j,k} − O_{i,j,k−1})² + ρ) (7)
where O_{i,j,k} is the three-dimensional imaging matrix, i, j and k are the row, column and height indices, and ρ is a small positive number that prevents the denominator from being 0;
step 2-4: three-dimensional median-filtering step
convert the total-variation-minimized imaging vector into a three-dimensional matrix and perform a three-dimensional median-filtering operation; the expression is:
O(i, j, k) = median{O(i′, j′, k′) : (i′, j′, k′) ∈ W(i, j, k)} (8)
where the window W slides over the volume in the manner of a cyclic convolution and the window size depends on the number of imaging-matrix elements; the final scene imaging result is obtained after this step.
CN201910259240.6A 2019-04-02 2019-04-02 Three-dimensional unknown scene imaging method based on bistatic radar Active CN109917361B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910259240.6A CN109917361B (en) 2019-04-02 2019-04-02 Three-dimensional unknown scene imaging method based on bistatic radar


Publications (2)

Publication Number Publication Date
CN109917361A CN109917361A (en) 2019-06-21
CN109917361B (en) 2023-04-25

Family

ID=66968099

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910259240.6A Active CN109917361B (en) 2019-04-02 2019-04-02 Three-dimensional unknown scene imaging method based on bistatic radar

Country Status (1)

Country Link
CN (1) CN109917361B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110928326A (en) * 2019-11-26 2020-03-27 南京航空航天大学 Measuring point difference planning method for aircraft appearance
CN111175740A (en) * 2020-01-09 2020-05-19 电子科技大学 Building layout reconstruction optimization method based on through-wall radar
CN113075738A (en) * 2021-03-26 2021-07-06 桂林理工大学 Ground penetrating radar measurement system based on unmanned aerial vehicle
CN113406637B (en) * 2021-06-23 2022-11-01 电子科技大学 Joint iterative tomography method based on dual-frequency narrow-band signals
CN113640798B (en) * 2021-08-11 2023-10-31 北京无线电测量研究所 Multi-angle reconstruction method, device and storage medium for radar target
CN116087235B (en) * 2023-04-07 2023-06-20 四川川交路桥有限责任公司 Multi-source coupling bridge damage detection method and system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103149554A (en) * 2013-02-02 2013-06-12 西安电子科技大学 Scaling inverse Fourier transformation imaging method of bistatic synthetic aperture radar (SAR)
CN104240210A (en) * 2014-07-21 2014-12-24 南京邮电大学 CT image iteration reconstruction method based on compressed sensing
CN105796121A (en) * 2016-03-02 2016-07-27 中国人民解放军第四军医大学 CT and X-ray luminescence computed dual-mode synchronous tomography method
CN105954745A (en) * 2016-04-29 2016-09-21 电子科技大学 Imaging method suitable for through-wall radar multipath phantom inhibition
CN106772365A (en) * 2016-11-25 2017-05-31 南京理工大学 A kind of multipath based on Bayes's compressed sensing utilizes through-wall radar imaging method
CN107942326A (en) * 2017-11-14 2018-04-20 西南交通大学 A kind of two-dimentional active MMW imaging method with high universalizable
CN108872980A (en) * 2018-06-19 2018-11-23 电子科技大学 A kind of adaptive through-wall imaging method based on narrowband systems
CN109194959A (en) * 2018-09-28 2019-01-11 中国科学院长春光学精密机械与物理研究所 A kind of compressed sensing imaging method, device, equipment, system and storage medium
CN109358328A (en) * 2018-11-06 2019-02-19 电子科技大学 The polar coordinates format image-forming method of the bistatic Forward-looking SAR of motor platform

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9971019B2 (en) * 2013-07-22 2018-05-15 Mitsubishi Electric Research Laboratories, Inc. System and method for through-the-wall-radar-imaging using total-variation denoising
US9261592B2 (en) * 2014-01-13 2016-02-16 Mitsubishi Electric Research Laboratories, Inc. Method and system for through-the-wall imaging using compressive sensing and MIMO antenna arrays
US10042046B2 (en) * 2015-07-07 2018-08-07 Mitsubishi Electric Research Laboratories, Inc. System and method for radar imaging using distributed arrays and compressive sensing


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Lingxiao Cao et al.; "Narrow-Band Through-Wall Imaging with Received Signal Strength Data"; 21st International Conference on Information Fusion (FUSION); Sept. 6, 2018; pp. 1274-1279. *
P. Jidesh et al.; "Non-local total variation regularization models for image restoration"; Computers and Electrical Engineering; vol. 67; Apr. 2018; pp. 114-133. *
Wang Fuyao; "Research on three-dimensional flame reconstruction based on tomography and compressed sensing" (基于层析成像和压缩感知的火焰三维重构研究); China Master's Theses Full-text Database, Information Science and Technology; no. 05 (2018); May 15, 2018; pp. I138-484. *
Zhao Xiang; "Research on three-dimensional through-wall imaging based on improved 3D-FFBP" (基于改进的3D-FFBP三维穿墙成像研究); China Master's Theses Full-text Database, Information Science and Technology; no. 02 (2019); Feb. 15, 2019; pp. I136-1334. *
Zhang Yan et al.; "Extended-target imaging method for through-wall radar combined with TV constraint" (结合TV约束的穿墙雷达扩展目标成像方法); Radar Science and Technology; vol. 15, no. 3; June 2017; pp. 229-235. *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant