CN114740476A - Video noise suppression method based on space-time trilateral filter - Google Patents


Info

Publication number
CN114740476A
Authority
CN
China
Prior art keywords
frame
target frame
image
video
window
Prior art date
Legal status
Granted
Application number
CN202210378103.6A
Other languages
Chinese (zh)
Other versions
CN114740476B (en)
Inventor
Ai Jiaqiu (艾加秋)
Wang Gang (王港)
Fan Gaowei (范高伟)
Yao Baidong (姚佰栋)
Current Assignee
Hefei University of Technology
Original Assignee
Hefei University of Technology
Priority date
Filing date
Publication date
Application filed by Hefei University of Technology filed Critical Hefei University of Technology
Priority to CN202210378103.6A priority Critical patent/CN114740476B/en
Publication of CN114740476A publication Critical patent/CN114740476A/en
Application granted granted Critical
Publication of CN114740476B publication Critical patent/CN114740476B/en
Status: Active

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90 Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G01S13/9021 SAR image post-processing techniques
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a video noise suppression method based on a space-time trilateral filter, which comprises the following steps: 1. splitting a video into frames and determining the target frame to be filtered and its adjacent frames; 2. setting a sliding window on the target frame, calculating the similarity of the gray values in the corresponding windows of the target frame and the adjacent frames, and adaptively selecting, via a threshold, a certain number of adjacent frames to participate in filtering the target frame; 3. selecting a Gaussian function to describe the temporal information between the adaptively selected adjacent frames for each window of the target frame; 4. constructing a trilateral filtering kernel from the geometric, gray-level, and temporal weights; 5. performing trilateral filtering on the target frame image using the temporal and spatial information of the video frame images to obtain the denoised image. The method improves the suppression of video noise while preserving important details in the image, and has good practical application value.

Description

Video noise suppression method based on space-time trilateral filter
Technical Field
The invention relates to the technical field of video SAR noise suppression, in particular to a method for suppressing video noise by using a trilateral filter based on space-time information.
Background
Video Synthetic Aperture Radar (VSAR) is a new day-and-night, all-weather microwave detection technology that combines SAR imaging with video techniques to continuously monitor a specific area and provide high-resolution SAR video. However, because of the speckle noise inherent in video SAR frame images, it is difficult for further image interpretation and subsequent applications to extract usable information. Suppression of VSAR speckle noise has therefore attracted great attention and become a research hotspot.
Synthetic Aperture Radar (SAR) images are affected by speckle noise in many applications. Video SAR has a high imaging frame rate, so its image frames contain a large amount of temporally redundant information, and research shows that this temporal redundancy is very helpful for suppressing speckle noise. Traditional filtering methods (Lee, Frost, Kuan, modified sigma, NLM, BM3D filters, etc.) do not perform well in removing the multiplicative speckle noise of video SAR images. More recent filtering methods include: Li et al., who apply an enhanced VBM3D algorithm (using temporal information for multi-frame averaging, then filtering the averaged image with the VBM3D algorithm) to denoise video SAR images; and Huang et al., who train a denoising network for video SAR images through unsupervised learning. These methods screen and exploit, to a certain extent, the temporally redundant information between adjacent frames of the video SAR image for filtering, and thereby improve the denoising effect.
When existing filtering methods smooth the noise of a video SAR image, the local detail features of targets in the SAR image are easily blurred or lost. The traditional bilateral filter jointly exploits the gray-value similarity and the geometric-space similarity of an image; it can effectively smooth Gaussian noise while preserving details, and is widely applied to optical-image denoising. However, owing to the imaging mechanism of SAR, the bilateral filter cannot be directly applied to speckle noise removal in video SAR images.
Disclosure of Invention
The invention aims to overcome the defects of existing filtering techniques by providing a video noise suppression method based on a space-time trilateral filter, so that the speckle noise in video SAR images can be effectively suppressed while the texture details of the image are retained, thereby improving the suppression of video noise.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention relates to a video noise suppression method based on a space-time trilateral filter, which is characterized by comprising the following steps:
Step 1: acquire a segment of synthetic aperture radar video and split it into N frame images, obtaining the video frame image set X = {X1, X2, …, Xi, …, XN}, where Xi denotes the i-th frame image and i ∈ [1, N];
Step 2: take the i-th frame image Xi as the target frame to be filtered; define the index of the frame preceding the target frame as L1 and initialize L1 = i − 1; define the index of the frame following the target frame as L2 and initialize L2 = i + 1;
Select a window of size a × a in the target frame;
Step 3: compute the Euclidean distance d(i, L1) between the gray values of the target frame within the window and of the L1-th frame image within the same initial window;
Step 4, if
Figure BDA0003591016060000022
Then L is1-1 assignment L1Then, returning to the step 3 for execution, otherwise, executing the step 5; wherein T is a threshold used to measure similarity;
Step 5: compute the Euclidean distance d(i, L2) between the gray values of the target frame within the window and of the L2-th frame image within the window;
Step 6: if d(i, L2) ≤ T, assign L2 + 1 to L2 and return to step 5; otherwise, execute step 7;
Step 7: from the finally obtained frame indices L1 and L2, determine the range [L1, L2] of adjacent frames similar to the initial window of the target frame;
Step 8, traversing the target frame by using a window, and obtaining adjacent frame ranges corresponding to the target frame under different windows according to the processes of the step 3 to the step 7;
Step 9: establish the time-weight Gaussian kernel function wt(i, n) characterizing the temporal information using formula (1):

wt(i, n) = exp(−(i − n)² / (2σt²))    (1)

In formula (1), i is the index of the target frame, n is the index of an adjacent frame with n ∈ [L1, L2], and σt is the temporal-similarity diffusion factor;
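As a non-limiting illustration (not part of the claims), the time weight wt(i, n) of formula (1) can be sketched in Python; the function name and the use of NumPy are assumptions for the sketch:

```python
import numpy as np

def time_weight(i, n, sigma_t):
    # Gaussian time-weight kernel of formula (1): adjacent frames whose
    # index n is closer to the target-frame index i receive a larger weight.
    return float(np.exp(-((i - n) ** 2) / (2.0 * sigma_t ** 2)))
```

The weight equals 1 when n = i and decays symmetrically as |i − n| grows, so frames farther in time contribute less to the filter.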
Step 10: from the geometric-space weight function wd(·) and the gray-level weight function wr(·) of the bilateral filter, together with the time-weight Gaussian kernel wt(i, n), construct the trilateral filtering kernel w using formula (2):

w = wd(xn, xn,0) · wr(I(xn), I(xn,0)) · wt(i, n)    (2)

In formula (2), xn and xn,0 denote a neighborhood pixel and the center pixel of the n-th frame image in the adjacent-frame range [L1, L2], I(xn) and I(xn,0) denote the gray values of the neighborhood pixel xn and the center pixel xn,0 of the n-th frame image, and Ωn denotes the neighborhood pixel set of the center pixel xn,0 of the n-th frame image;
Step 11: fuse and normalize all image frames in the adjacent-frame range [L1, L2] using formula (3) to obtain the fused target frame I′(x) under the window corresponding to [L1, L2]:

I′(x) = [ Σn=L1..L2 Σxn∈Ωn w · I(xn) ] / [ Σn=L1..L2 Σxn∈Ωn w ]    (3)

In formula (3), x denotes any pixel of the fused target frame I′(x) under the window corresponding to [L1, L2];
Step 12: perform trilateral filtering on the fused target frame I′(x) using formula (4) to obtain the denoised target frame Î(x′) under the window corresponding to [L1, L2]; the denoised target frames under the different windows then form the denoised image Î:

Î(x′) = [ Σx∈Ω0 wd(x, x′) · wr(I′(x), I′(x′)) · I′(x) ] / [ Σx∈Ω0 wd(x, x′) · wr(I′(x), I′(x′)) ]    (4)

In formula (4), x′ denotes the center pixel of the denoised target frame under the window corresponding to [L1, L2], and Ω0 denotes the neighborhood pixel set of the center pixel x′ of the fused target frame I′(x).
Compared with the prior art, the invention has the beneficial effects that:
1. according to the method, the similarity between the video frame images is compared, so that the time information between adjacent frames of the video SAR images is effectively explored and utilized, a large amount of speckle noise in the video SAR images can be well smoothed, and meanwhile, edge and texture details are kept.
2. Before trilateral filtering is carried out on an image, a reasonable threshold value is set, the similarity of a target frame and adjacent frames is calculated to realize self-adaptive frame selection of each filtering window of the target frame, the time weight of the adjacent frames to each filtering window of the target frame is determined according to the selected frame number, therefore, the similarity information between frames is effectively utilized and participates in the construction of a trilateral filter weight kernel in the form of the time weight, the defect that speckle noise in an SAR image cannot be effectively removed by a traditional bilateral filter is overcome, and the noise of the target frame image is effectively smoothed by utilizing the time-space information of video SAR data.
3. The method skillfully exploits the time-space information of adjacent frames similar to the target frame, can significantly improve the speckle-noise smoothing capability for video SAR images while preserving the texture details of the image as much as possible, and thereby provides convenience for the interpretation and further application (terrain classification, target detection, etc.) of video SAR data.
4. The trilateral filter of the present invention is designed to smooth the speckle noise heavily distributed in video SAR images. A trilateral (three-dimensional) coefficient weighting model is designed that simultaneously considers the temporal information between adjacent frames in the SAR video and the geometric and gray-level similarity of pixels within a frame, thereby effectively smoothing speckle noise while maintaining image details. The proposed trilateral filter adopts an adaptive frame-selection strategy: by judging the similarity between adjacent frames and the target frame, it adaptively selects the frames that participate in speckle-noise filtering of the target frame, so that the speckle noise of the video SAR image is smoothed while the edge and texture information of the image is effectively maintained.
5. Compared with the traditional bilateral filter, the trilateral filtering method provided by the invention combines time correlation information between adjacent frames, fully discovers and utilizes time-space information of the video SAR image, can obtain a better speckle noise removing effect by smoothing speckle noise, and can well keep the edge and texture details of the image.
Drawings
FIG. 1 is a flow chart of the spatio-temporal trilateral-filter-based video noise suppression method of the present invention;
FIG. 2 is a composition diagram of the trilateral (three-dimensional) weight kernel function of the present invention;
FIG. 3 is a diagram illustrating the quantitative evaluation of the speckle noise removal effect of a video SAR image under different threshold values T by the spatiotemporal information-based trilateral filtering method of the present invention;
FIG. 4 is a comparison graph of the filtering results of the trilateral filtering method based on spatio-temporal information and other common SAR image filtering algorithms (bilateral filtering, VBM3D and ATS-RBF algorithm) on the 8 th frame image of the experimental video SAR.
Detailed Description
The present invention is further illustrated by, but not limited to, the following embodiment.
In this embodiment, as shown in FIG. 1, a method for suppressing video noise based on a spatio-temporal trilateral filter includes the following steps:
Step 1: select a segment of synthetic aperture radar (SAR) video as the experimental object and split it into N frames, obtaining the video frame image set X = {X1, X2, …, Xi, …, XN}, where Xi denotes the i-th frame image and i ∈ [1, N];
Step 2: take the i-th frame image Xi as the target frame to be filtered; define the index of the frame preceding the target frame as L1 and initialize L1 = i − 1; define the index of the frame following the target frame as L2 and initialize L2 = i + 1;
Select a window of size a × a in the target frame;
Step 3: compute, using formula (1), the Euclidean distance d(i, L1) between the gray values of the target frame within the window and of the L1-th frame image within the initial window:

d(i, L1) = sqrt( Σj=1..a² (xj − yj)² )    (1)

In formula (1), j denotes the index of a pixel within the window, j ∈ [1, a²], and xj, yj denote the gray values of the pixels in the corresponding windows of the target frame and the adjacent frame, respectively;
Step 4: if d(i, L1) ≤ T, assign L1 − 1 to L1 and return to step 3; otherwise, execute step 5. Here T is a threshold used to measure similarity; its optimal value is determined by a series of simulation experiments that jointly consider the ENL and ESI values of the filtered image under different thresholds T. According to the simulation results of FIG. 3, the optimal threshold is T = 70, which balances noise smoothing and detail preservation in the filtered image;
Step 5: compute the Euclidean distance d(i, L2) between the gray values of the target frame within the window and of the L2-th frame image within the window;
Step 6: if d(i, L2) ≤ T, assign L2 + 1 to L2 and return to step 5; otherwise, execute step 7;
Step 7: from the finally obtained frame indices L1 and L2, determine the range [L1, L2] of adjacent frames similar to the initial window of the target frame;
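A minimal sketch of the adaptive frame selection of steps 2 to 7, assuming the frames are stored as 2-D NumPy arrays and reading the final [L1, L2] as the largest range whose window distance stays within T (the function names and the tie-breaking of the loop bounds are illustrative assumptions, not the claims):

```python
import numpy as np

def window_distance(target_win, neighbor_win):
    # Euclidean distance between the gray values of two a x a windows,
    # as in formula (1) of the embodiment.
    return float(np.sqrt(np.sum((target_win - neighbor_win) ** 2)))

def select_frame_range(frames, i, top_left, a, T):
    # Steps 2-7: starting from target frame i, grow the neighboring-frame
    # range [L1, L2] while the window distance stays within threshold T.
    r, c = top_left
    win = lambda f: frames[f][r:r + a, c:c + a]
    L1 = i
    while L1 - 1 >= 0 and window_distance(win(i), win(L1 - 1)) <= T:
        L1 -= 1
    L2 = i
    while L2 + 1 < len(frames) and window_distance(win(i), win(L2 + 1)) <= T:
        L2 += 1
    return L1, L2
```

Sliding the window over the whole target frame (step 8) then yields one such range per window position.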
Step 8, traversing the target frame by using the windows, and obtaining adjacent frame ranges corresponding to the target frame under different windows according to the processes of the step 3 to the step 7;
Step 9: establish, using formula (2), the time-weight Gaussian kernel function wt(i, n) characterizing the temporal information:

wt(i, n) = exp(−(i − n)² / (2σt²))    (2)

In formula (2), i is the index of the target frame, n is the index of an adjacent frame with n ∈ [L1, L2], and σt is the temporal-similarity diffusion factor;
Step 10: as shown in FIG. 2, from the geometric-space weight function wd(·) and the gray-level weight function wr(·) of the bilateral filter, together with the time-weight Gaussian kernel wt(i, n), construct the trilateral filtering kernel w using formula (5):

w = wd(xn, xn,0) · wr(I(xn), I(xn,0)) · wt(i, n)    (5)

where, as in the classical bilateral filter,

wd(xn, xn,0) = exp(−‖xn − xn,0‖² / (2σd²)),  wr(I(xn), I(xn,0)) = exp(−(I(xn) − I(xn,0))² / (2σr²))

In formula (5), xn and xn,0 denote a neighborhood pixel and the center pixel of the n-th frame image in the adjacent-frame range [L1, L2], I(xn) and I(xn,0) denote the gray values of the neighborhood pixel xn and the center pixel xn,0 of the n-th frame image, and Ωn denotes the neighborhood pixel set of the center pixel xn,0 of the n-th frame image;
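The product of the three weights in formula (5) can be sketched as follows; the argument layout and the explicit Gaussian forms of wd and wr (taken from the classical bilateral filter) are assumptions of this sketch:

```python
import numpy as np

def trilateral_weight(p, p0, g, g0, i, n, sigma_d, sigma_r, sigma_t):
    # Formula (5): product of the geometric (w_d), gray-level (w_r) and
    # temporal (w_t) Gaussian factors. p and p0 are (row, col) pixel
    # coordinates, g and g0 their gray values, i and n the frame indices.
    w_d = np.exp(-((p[0] - p0[0]) ** 2 + (p[1] - p0[1]) ** 2) / (2.0 * sigma_d ** 2))
    w_r = np.exp(-((g - g0) ** 2) / (2.0 * sigma_r ** 2))
    w_t = np.exp(-((i - n) ** 2) / (2.0 * sigma_t ** 2))
    return float(w_d * w_r * w_t)
```

Setting n = i reduces w_t to 1, which recovers an ordinary bilateral weight for the target frame itself.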
Step 11: fuse and normalize all image frames in the adjacent-frame range [L1, L2] using formula (6) to obtain the fused target frame I′(x) under the window corresponding to [L1, L2]:

I′(x) = [ Σn=L1..L2 Σxn∈Ωn w · I(xn) ] / [ Σn=L1..L2 Σxn∈Ωn w ]    (6)

In formula (6), x denotes any pixel of the fused target frame I′(x) under the window corresponding to [L1, L2];
Step 12: perform trilateral filtering on the fused target frame I′(x) using formula (7) to obtain the denoised target frame Î(x′) under the window corresponding to [L1, L2]; the denoised target frames under the different windows then form the denoised image Î:

Î(x′) = [ Σx∈Ω0 wd(x, x′) · wr(I′(x), I′(x′)) · I′(x) ] / [ Σx∈Ω0 wd(x, x′) · wr(I′(x), I′(x′)) ]    (7)

In formula (7), x′ denotes the center pixel of the fused target frame I′(x) under the window corresponding to [L1, L2], and Ω0 denotes the neighborhood pixel set of the center pixel x′ of the fused target frame I′(x).
At this point, the video noise suppression method based on the space-time trilateral filter is complete.
The effectiveness of the invention is further illustrated by the following real video SAR data experiments.
1. Experimental setup:
the effectiveness of the proposed denoising method is demonstrated by using real video SAR data published by the sandia national laboratory. This video covers the door to the cutland air force base and is available from their official website. The video SAR data comprises 900 frames, the frame rate is 29 Hz, the video frame rate is high, the lens has no obvious change, and the problem of image registration caused by the change of the visual angle does not need to be considered for adjacent frames, so that the video noise suppression method based on the spatio-temporal trilateral filter is suitable for verification.
2. And (4) analyzing results:
the experiment in this example carried out quantitative analysis on the performance of the method proposed by the present invention using equivalent norm ENL and edge preservation coefficient ESI. In order to illustrate the superiority of the method provided by the invention, several common video SAR image filtering methods are selected for comparison, and the methods comprise traditional Bilateral Filtering (BF), VBM3D (video three-dimensional block matching algorithm) and ATS-RBF (improved bilateral filtering algorithm based on self-adaptive pruning statistics). Wherein:
ENL = [E(I)]² / Var(I)    (8)

ESI = Σi Σj ( |M̂i,j − M̂i,j+1| + |M̂i,j − M̂i+1,j| ) / Σi Σj ( |Mi,j − Mi,j+1| + |Mi,j − Mi+1,j| )    (9)

In formula (8), E(I) and Var(I) denote the mean and variance, respectively, of a given image or region. In formula (9), Mi,j and M̂i,j denote the gray values at pixel (i, j) of the image before and after filtering, respectively, and the image or region has size m × n.
TABLE 1 Performance evaluation index (ENL) of spatio-temporal trilateral filter-based video noise suppression method
The results are analyzed by combining FIG. 4 and Table 1. In FIG. 4, the first row of images shows the overall effect of the different filtering algorithms, while the second and third rows compare, respectively, a representative texture-detail area and a flat image area (where speckle noise is heavily distributed and there are no obvious edges or texture details) selected from the results in the first row. After processing by the traditional bilateral filter, the video SAR image shows no obvious improvement and still contains a large amount of speckle noise. After the VBM3D filtering algorithm, the speckle noise is well smoothed, but some important details of the image are blurred. The ATS-RBF method preserves image details to a certain extent while smoothing speckle noise, but its effect is inferior to the proposed trilateral filtering method based on spatio-temporal information. The proposed method obtains the highest ENL value in the four representative flat image areas (heavy speckle noise, few details), corresponding to the best smoothing effect; and because the edge-preservation index ESI is taken into account when setting the optimal filtering threshold T, the details of the video SAR image are also well preserved after filtering. The trilateral filtering method based on spatio-temporal information thus has clear advantages in processing video SAR images and has practical value.

Claims (1)

1. A video noise suppression method based on a space-time trilateral filter is characterized by comprising the following steps:
Step 1: acquire a segment of synthetic aperture radar video and split it into N frame images, obtaining the video frame image set X = {X1, X2, …, Xi, …, XN}, where Xi denotes the i-th frame image and i ∈ [1, N];
Step 2: take the i-th frame image Xi as the target frame to be filtered; define the index of the frame preceding the target frame as L1 and initialize L1 = i − 1; define the index of the frame following the target frame as L2 and initialize L2 = i + 1;
Select a window of size a × a in the target frame;
Step 3: compute the Euclidean distance d(i, L1) between the gray values of the target frame within the window and of the L1-th frame image within the initial window;
Step 4: if d(i, L1) ≤ T, assign L1 − 1 to L1 and return to step 3; otherwise, execute step 5; where T is a threshold used to measure similarity;
Step 5: compute the Euclidean distance d(i, L2) between the gray values of the target frame within the window and of the L2-th frame image within the window;
Step 6: if d(i, L2) ≤ T, assign L2 + 1 to L2 and return to step 5; otherwise, execute step 7;
Step 7: from the finally obtained frame indices L1 and L2, determine the range [L1, L2] of adjacent frames similar to the initial window of the target frame;
Step 8, traversing the target frame by using a window, and obtaining adjacent frame ranges corresponding to the target frame under different windows according to the processes of the step 3 to the step 7;
Step 9: establish the time-weight Gaussian kernel function wt(i, n) characterizing the temporal information using formula (1):

wt(i, n) = exp(−(i − n)² / (2σt²))    (1)

In formula (1), i is the index of the target frame, n is the index of an adjacent frame with n ∈ [L1, L2], and σt is the temporal-similarity diffusion factor;
Step 10: from the geometric-space weight function wd(·) and the gray-level weight function wr(·) of the bilateral filter, together with the time-weight Gaussian kernel wt(i, n), construct the trilateral filtering kernel w using formula (2):

w = wd(xn, xn,0) · wr(I(xn), I(xn,0)) · wt(i, n)    (2)

In formula (2), xn and xn,0 denote a neighborhood pixel and the center pixel of the n-th frame image in the adjacent-frame range [L1, L2], I(xn) and I(xn,0) denote the gray values of the neighborhood pixel xn and the center pixel xn,0 of the n-th frame image, and Ωn denotes the neighborhood pixel set of the center pixel xn,0 of the n-th frame image;
Step 11: fuse and normalize all image frames in the adjacent-frame range [L1, L2] using formula (3) to obtain the fused target frame I′(x) under the window corresponding to [L1, L2]:

I′(x) = [ Σn=L1..L2 Σxn∈Ωn w · I(xn) ] / [ Σn=L1..L2 Σxn∈Ωn w ]    (3)

In formula (3), x denotes any pixel of the fused target frame I′(x) under the window corresponding to [L1, L2];
Step 12: perform trilateral filtering on the fused target frame I′(x) using formula (4) to obtain the denoised target frame Î(x′) under the window corresponding to [L1, L2]; the denoised target frames under the different windows then form the denoised image Î:

Î(x′) = [ Σx∈Ω0 wd(x, x′) · wr(I′(x), I′(x′)) · I′(x) ] / [ Σx∈Ω0 wd(x, x′) · wr(I′(x), I′(x′)) ]    (4)

In formula (4), x′ denotes the center pixel of the denoised target frame under the window corresponding to [L1, L2], and Ω0 denotes the neighborhood pixel set of the center pixel x′ of the fused target frame I′(x).
CN202210378103.6A 2022-04-12 2022-04-12 Video noise suppression method based on space-time trilateral filter Active CN114740476B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210378103.6A CN114740476B (en) 2022-04-12 2022-04-12 Video noise suppression method based on space-time trilateral filter


Publications (2)

Publication Number Publication Date
CN114740476A (en) 2022-07-12
CN114740476B (en) 2024-04-26

Family

ID=82281531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210378103.6A Active CN114740476B (en) 2022-04-12 2022-04-12 Video noise suppression method based on space-time trilateral filter

Country Status (1)

Country Link
CN (1) CN114740476B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103472450A (en) * 2013-09-18 2013-12-25 哈尔滨工业大学 Non-uniform space configuration distributed SAR moving target three-dimensional imaging method based on compressed sensing
CN108805835A (en) * 2018-05-30 2018-11-13 合肥工业大学 Based on the SAR image bilateral filtering method for blocking statistical nature
CN109767400A (en) * 2019-01-14 2019-05-17 三峡大学 A kind of ultrasound image speckle noise minimizing technology of three sides of guiding filtering
US20210125318A1 (en) * 2019-10-29 2021-04-29 Visidon Oy Image processing method and apparatus


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
KOSTADIN DABOV ET AL.: "Image Denoising by Sparse 3-D Transform-Domain Collaborative Filtering", 《IEEE TRANSACTIONS ON IMAGE PROCESSING》, vol. 16, no. 8, 31 August 2007 (2007-08-31), pages 2080 - 2095, XP011187305, DOI: 10.1109/TIP.2007.901238 *
AI JIAQIU et al.: "SAR image speckle noise suppression algorithm based on background-homogeneity bilateral filtering", 《通感学报》, 31 December 2021 (2021-12-31), pages 1071-1084 *

Also Published As

Publication number Publication date
CN114740476B (en) 2024-04-26

Similar Documents

Publication Publication Date Title
CN108921800B (en) Non-local mean denoising method based on shape self-adaptive search window
CN109360156B (en) Single image rain removing method based on image block generation countermeasure network
WO2021217643A1 (en) Method and device for infrared image processing, and movable platform
Li et al. Single image rain streak decomposition using layer priors
CN109636766B (en) Edge information enhancement-based polarization difference and light intensity image multi-scale fusion method
CN108805835B (en) SAR image bilateral filtering method based on truncation statistical characteristics
CN110675340A (en) Single image defogging method and medium based on improved non-local prior
CN109712149B (en) Image segmentation method based on wavelet energy and fuzzy C-means
Ding et al. U 2 D 2 Net: Unsupervised unified image dehazing and denoising network for single hazy image enhancement
CN110147816B (en) Method and device for acquiring color depth image and computer storage medium
Estrada et al. Stochastic Image Denoising.
CN114549492A (en) Quality evaluation method based on multi-granularity image information content
Xu et al. Remote sensing image denoising using patch grouping-based nonlocal means algorithm
CN103971345A (en) Image denoising method based on improved bilateral filtering
CN116051415B (en) Video SAR sequential image speckle filtering method based on super-pixel segmentation
CN113421210A (en) Surface point cloud reconstruction method based on binocular stereo vision
CN111311508B (en) Noise reduction method for pavement crack image with noise
CN112819739A (en) Scanning electron microscope image processing method and system
Fazlali et al. Aerial image dehazing using a deep convolutional autoencoder
CN113759375B (en) SAR image non-local mean denoising method based on statistical characteristics
CN114740476B (en) Video noise suppression method based on space-time trilateral filter
CN111461999A (en) SAR image speckle suppression method based on super-pixel similarity measurement
CN116342519A (en) Image processing method based on machine learning
CN112991326B (en) Cleaning quality evaluation method
CN115937302A (en) Hyperspectral image sub-pixel positioning method combined with edge preservation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant