CN111344736A - Image processing method, image processing device and unmanned aerial vehicle - Google Patents


Info

Publication number
CN111344736A
Authority
CN
China
Prior art keywords
image
result
ith
filtering
frequency
Prior art date
Legal status
Pending
Application number
CN201880069922.4A
Other languages
Chinese (zh)
Inventor
李静 (Li Jing)
袁一璟 (Yuan Yijing)
彭亮 (Peng Liang)
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN111344736A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/70 Denoising; Smoothing
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20192 Edge enhancement; Edge preservation
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure relates to an image processing method, including: downsampling an original image n-1 times to determine images at n resolutions; performing edge-preserving filtering on each image to obtain high-frequency information and a filtering result; fusing the high-frequency information of the ith image with that of the (i-1)th image to obtain the (i-1)th high-frequency fusion result; and, according to the (i-1)th high-frequency fusion result, fusing the filtering result of the ith image with that of the (i-1)th image to obtain the (i-1)th filtering fusion result. According to the embodiments of the present disclosure, more detail can be retained in high-frequency regions of an image while stronger denoising is achieved in low-frequency regions.

Description

Image processing method, image processing device and unmanned aerial vehicle
Technical Field
The present invention relates to the technical field of image processing, and in particular to an image processing method, an image processing device, and an unmanned aerial vehicle.
Background
In order to obtain a good display effect, the image needs to be denoised to remove noise in the image.
However, with current image denoising methods, achieving a stronger denoising effect sacrifices more of the image's detail, while retaining more detail weakens the denoising effect.
Disclosure of Invention
The present disclosure provides an image processing method, an image processing device, and an unmanned aerial vehicle, aiming to solve the above technical problems in the related art.
According to a first aspect of the embodiments of the present disclosure, an image processing method is provided, including:
performing n-1 downsampling operations on an original image to determine images at n resolutions, wherein the resolution of the ith image is smaller than that of the (i-1)th image, i is greater than 1 and less than or equal to n, and the 1st image is the original image;
performing edge-preserving filtering on each image to obtain high-frequency information and a filtering result;
respectively executing the following steps on the 2 nd to the nth images until a 1 st filtering fusion result is obtained:
fusing the high-frequency information of the ith image with the high-frequency information of the (i-1)th image to obtain an (i-1)th high-frequency fusion result, wherein when i is less than n, the ith high-frequency fusion result is used as the high-frequency information of the ith image;
and, according to the (i-1)th high-frequency fusion result, fusing the filtering result of the ith image with the filtering result of the (i-1)th image to obtain an (i-1)th filtering fusion result, wherein when i is less than n, the ith filtering fusion result is used as the filtering result of the ith image.
According to a second aspect of the embodiments of the present disclosure, an image processing apparatus is provided, which includes a processor configured to,
perform n-1 downsampling operations on an original image to determine images at n resolutions, wherein the resolution of the ith image is smaller than that of the (i-1)th image, i is greater than 1 and less than or equal to n, and the 1st image is the original image;
performing edge-preserving filtering on each image to obtain high-frequency information and a filtering result;
respectively executing the following steps on the 2 nd to the nth images until a 1 st filtering fusion result is obtained:
fusing the high-frequency information of the ith image with the high-frequency information of the (i-1)th image to obtain an (i-1)th high-frequency fusion result, wherein when i is less than n, the ith high-frequency fusion result is used as the high-frequency information of the ith image;
and, according to the (i-1)th high-frequency fusion result, fusing the filtering result of the ith image with the filtering result of the (i-1)th image to obtain an (i-1)th filtering fusion result, wherein when i is less than n, the ith filtering fusion result is used as the filtering result of the ith image.
According to a third aspect of the embodiments of the present disclosure, an unmanned aerial vehicle is provided, which includes the image processing apparatus of the above embodiments.
According to the embodiments of the present disclosure, since the ith image is obtained by downsampling the (i-1)th image and its resolution is therefore smaller, part of the high-frequency information is lost and a better noise-reduction effect is achieved; the filtering result of the ith image thus contains less high-frequency information than that of the (i-1)th image. Moreover, a pixel with a larger high-frequency fusion result is more likely to correspond to an edge in the filtered image, so more high-frequency information can be retained there, allowing edges to be correctly extracted.
On this basis, the filtering result of the ith image and that of the (i-1)th image can be fused according to the (i-1)th high-frequency fusion result; for example, the (i-1)th high-frequency fusion result is set to be inversely related to the weight of the filtering result of the ith image and positively related to the weight of the filtering result of the (i-1)th image, so that the (i-1)th filtering fusion result retains more detail in high-frequency regions while achieving stronger denoising in low-frequency regions.
Furthermore, when i is less than n, the ith filtering fusion result is taken as the filtering result of the ith image, and the last two steps above are repeated for the 2nd to (n-1)th images, so that the (i-2)th filtering fusion result is obtained from the (i-1)th, the (i-3)th from the (i-2)th, and so on, until the 1st filtering fusion result is obtained. Compared with the original image, the 1st filtering fusion result thus both retains more detail in high-frequency regions and achieves stronger denoising in low-frequency regions.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart diagram illustrating an image processing method according to an embodiment of the present disclosure.
FIG. 2 is a schematic flow chart illustrating the process of fusing the high-frequency information of the ith image with that of the (i-1)th image to obtain the (i-1)th high-frequency fusion result, according to an embodiment of the disclosure.
FIG. 3 is a schematic flow chart of fusing the filtering result of the ith image with that of the (i-1)th image according to the (i-1)th high-frequency fusion result to obtain the (i-1)th filtering fusion result.
FIG. 4 is a schematic flow chart of determining the (i-1)th second weight corresponding to the (i-1)th high-frequency fusion result.
Fig. 5 is a schematic flow chart diagram illustrating another image processing method according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention. In addition, the features in the embodiments and the examples described below may be combined with each other without conflict.
Fig. 1 is a schematic flow chart illustrating an image processing method according to an embodiment of the present disclosure. The image processing method shown in this embodiment may be applied to a terminal, such as a mobile phone, a tablet computer, or a wearable device; to a server; or to another device with a data-processing function, such as an unmanned aerial vehicle.
As shown in fig. 1, the image processing method may include the steps of:
Step S1, the original image is downsampled n-1 times to determine images at n resolutions, wherein the resolution of the ith image is smaller than that of the (i-1)th image, i is greater than 1 and less than or equal to n, and the 1st image is the original image.
In one embodiment, images of lower resolution can be obtained by downsampling the original image, and the ratio by which the resolution is reduced at each downsampling step may be the same for every step or may differ between steps. The following explanation mainly treats the case where the ratio is the same at every step.
For example, if n is 4, the original image is downsampled 3 times, and at each step the resolution is reduced by a ratio of 1/2, that is, the resolution after downsampling is 1/2 of the resolution before it. Then the resolution of the 2nd image d2, obtained by the first downsampling, is 1/2 of that of the original image d1; the resolution of the 3rd image d3, obtained by the second downsampling, is 1/2 of that of d2; and the resolution of the 4th image d4, obtained by the third downsampling, is 1/2 of that of d3.
The following description mainly exemplifies the embodiments of the present disclosure for the case where n is 4 and the resolution is halved at each downsampling step.
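As a concrete illustration of step S1 under these assumptions (n = 4, resolution halved at each step), the pyramid construction can be sketched in Python. Mean downsampling, one of the options listed later in this document, is used here; the function names (downsample2x, build_pyramid) and the list-of-lists image representation are illustrative choices, not taken from the patent.

```python
def downsample2x(img):
    """Mean-downsample a 2-D image (a list of rows of floats) by a factor
    of 2: each output pixel is the average of a 2x2 block of input pixels."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*r][2*c] + img[2*r][2*c+1] +
              img[2*r+1][2*c] + img[2*r+1][2*c+1]) / 4.0
             for c in range(w)]
            for r in range(h)]

def build_pyramid(original, n=4):
    """Step S1: downsample n-1 times.  The returned list holds the 1st image
    (the original) through the nth image, each at half the previous resolution."""
    pyramid = [original]
    for _ in range(n - 1):
        pyramid.append(downsample2x(pyramid[-1]))
    return pyramid
```

For an 8x8 original, this yields d1 (8x8), d2 (4x4), d3 (2x2), and d4 (1x1), matching the d1..d4 example above.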
In step S2, edge-preserving filtering is performed on each image to obtain high-frequency information and filtering results.
In an embodiment, edge-preserving filtering may be performed on each image, for example, the above-mentioned d1, d2, d3, and d4, respectively, where the edge-preserving filtering refers to a filtering manner capable of effectively preserving edge information in the image in a filtering process, and for example, may be bilateral filtering, guided filtering, weighted least squares filtering, and the like.
After the image is filtered through edge-preserving filtering, high-frequency information and a filtering result of the image can be obtained.
The high-frequency information of a region indicates whether that region changes sharply, specifically, whether it contains more texture; for example, the larger the high-frequency information corresponding to a pixel, the more likely that pixel belongs to the edge of an object in the image.
The filtering result is the result of removing some noise and details from the pre-filtered image, and theoretically the signal-to-noise ratio of the filtering result is higher than that of the pre-filtered image. The filtering result of edge-preserving filtering on the down-sampled image can reflect the low-frequency information of the image before down-sampling, and the low-frequency information of a certain area can express whether the area changes smoothly, specifically, whether a large number of color blocks exist in the area.
For convenience of description, the high-frequency information of the 4th image d4 is denoted d4_diff and its filtering result d4_filter; the high-frequency information of the 3rd image d3 is d3_diff and its filtering result d3_filter; the high-frequency information of the 2nd image d2 is d2_diff and its filtering result d2_filter; and the high-frequency information of the 1st image d1 is d1_diff and its filtering result d1_filter.
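A minimal sketch of step S2 follows. A 3x3 mean filter stands in for a true edge-preserving filter (a real implementation would use bilateral, guided, or weighted least squares filtering as named above), and the high-frequency information is taken as the absolute residual between the image and its filtering result; that residual definition is one plausible choice for this sketch, not necessarily the patent's.

```python
def filter_and_diff(img):
    """Step S2 sketch: split an image (list of rows of floats) into a
    filtering result and high-frequency information.  The 3x3 mean filter
    here is a placeholder for an edge-preserving filter."""
    h, w = len(img), len(img[0])
    filt = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            acc = 0.0
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr = min(max(r + dr, 0), h - 1)  # replicate borders
                    cc = min(max(c + dc, 0), w - 1)
                    acc += img[rr][cc]
            filt[r][c] = acc / 9.0
    # High-frequency information: magnitude of what the filter removed.
    diff = [[abs(img[r][c] - filt[r][c]) for c in range(w)] for r in range(h)]
    return filt, diff
```

On a perfectly flat image the filtering result equals the input and the high-frequency information is zero everywhere, consistent with "smooth regions carry low-frequency information".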
The following steps are performed for the 2nd to nth images until the 1st filtering fusion result is obtained:
Step S3, fusing the high-frequency information of the ith image with that of the (i-1)th image to obtain the (i-1)th high-frequency fusion result, wherein when i is less than n, the ith high-frequency fusion result is used as the high-frequency information of the ith image.
Step S4, according to the (i-1)th high-frequency fusion result, fusing the filtering result of the ith image with that of the (i-1)th image to obtain the (i-1)th filtering fusion result, wherein when i is less than n, the ith filtering fusion result is used as the filtering result of the ith image.
In one embodiment, starting from the nth image, the high-frequency information of the ith image and that of the (i-1)th image are fused to obtain the (i-1)th high-frequency fusion result. For example, if n is 4, starting from the 4th image d4, the high-frequency information d4_diff of the 4th image and the high-frequency information d3_diff of the 3rd image d3 are fused to obtain the 3rd high-frequency fusion result d3_diff_merge.
When i is less than n, the ith high-frequency fusion result is used as the high-frequency information of the ith image: for example, when i is 3, the 3rd high-frequency fusion result d3_diff_merge is used as the high-frequency information d3_diff of the 3rd image, and when i is 2, the 2nd high-frequency fusion result d2_diff_merge is used as the high-frequency information d2_diff of the 2nd image.
After the high-frequency information d2_diff of the 2nd image is obtained, it may be further fused with the high-frequency information d1_diff of the 1st image to obtain the 1st high-frequency fusion result d1_diff_merge.
Since downsampling reduces image noise, the 1st high-frequency fusion result, obtained by successively fusing the high-frequency information of images at different resolutions, reduces noise interference while retaining high-frequency information and improves the accuracy of edge detection, compared with the high-frequency information of the original (1st) image.
Further, according to the (i-1)th high-frequency fusion result, the filtering result of the ith image and that of the (i-1)th image are fused to obtain the (i-1)th filtering fusion result. For example, the fusion is a weighted summation in which the (i-1)th high-frequency fusion result is inversely related to the weight of the filtering result of the ith image and positively related to the weight of the filtering result of the (i-1)th image, so that for a pixel with a larger high-frequency fusion result, the weight on the ith image's filtering result is smaller and the weight on the (i-1)th image's filtering result is larger.
Since the ith image is obtained by downsampling the (i-1)th image, and its resolution is therefore smaller, part of the high-frequency information is lost and a better noise-reduction effect is achieved; the filtering result of the ith image thus contains less high-frequency information than that of the (i-1)th image. A pixel with a larger high-frequency fusion result is more likely to correspond to an edge in the filtered image, so more high-frequency information can be retained there, allowing edges to be correctly extracted.
On this basis, the filtering result of the ith image and that of the (i-1)th image can be fused according to the (i-1)th high-frequency fusion result; for example, the (i-1)th high-frequency fusion result is inversely related to the weight of the filtering result of the ith image and positively related to the weight of the filtering result of the (i-1)th image. Where the (i-1)th high-frequency fusion result is larger, the pixel is likely to carry high-frequency information, so the weight on the (i-1)th image's filtering result is larger and the weight on the ith image's filtering result is smaller; where it is smaller, the pixel is likely to carry low-frequency information, so the weight on the (i-1)th image's filtering result is smaller and the weight on the ith image's filtering result is larger. In this way, the (i-1)th filtering fusion result both retains more detail in high-frequency regions and achieves stronger denoising in low-frequency regions.
Further, when i is less than n, the ith high-frequency fusion result is used as the high-frequency information of the ith image and the ith filtering fusion result as the filtering result of the ith image, so that while steps S3 and S4 are executed on the 2nd to (n-1)th images, the high-frequency information of the (i-1)th image can be fused with that of the (i-2)th image to obtain the (i-2)th high-frequency fusion result, the high-frequency information of the (i-2)th image with that of the (i-3)th image to obtain the (i-3)th high-frequency fusion result, and so on, until the 1st high-frequency fusion result is obtained. Likewise, the filtering result of the ith image can be fused with that of the (i-1)th image to obtain the (i-1)th filtering fusion result, the filtering result of the (i-1)th image with that of the (i-2)th image to obtain the (i-2)th filtering fusion result, and so on, until the 1st filtering fusion result is obtained. Compared with the original image, the 1st filtering fusion result retains more detail in high-frequency regions while achieving stronger denoising in low-frequency regions.
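The coarse-to-fine cascade of steps S3 and S4 described above can be summarized structurally as follows. The per-level operations (upsampling and the two fusion rules) are passed in as callables so that the sketch captures only the recursion; the function name cascade_fuse and this factoring are invented for illustration.

```python
def cascade_fuse(diffs, filters, upsample, fuse_hf, fuse_filt):
    """Structural sketch of steps S3/S4: walk from the coarsest level
    (index n-1, i.e. the nth image) up to the finest, replacing level i-1's
    bands with the fusion results, so the return value is the 1st filtering
    fusion result.  `diffs` and `filters` hold per-level high-frequency
    information and filtering results; both lists are updated in place."""
    n = len(diffs)
    for i in range(n - 1, 0, -1):  # i = n .. 2 in the patent's numbering
        # Step S3: (i-1)th high-frequency fusion result becomes level i-1's
        # high-frequency information.
        diffs[i - 1] = fuse_hf(upsample(diffs[i]), diffs[i - 1])
        # Step S4: fuse filtering results, guided by the new fusion result.
        filters[i - 1] = fuse_filt(upsample(filters[i]), filters[i - 1],
                                   diffs[i - 1])
    return filters[0]
```

The recursion can be exercised with scalar stand-ins for whole images, e.g. identity upsampling and simple averaging rules, without committing to any particular filter or weight scheme.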
On the basis of the embodiment shown in FIG. 1, FIG. 2 is a schematic flow chart of fusing the high-frequency information of the ith image with that of the (i-1)th image to obtain the (i-1)th high-frequency fusion result, according to an embodiment of the disclosure. As shown in FIG. 2, this fusion includes:
Step S31, upsampling the high-frequency information of the ith image to obtain the ith high-frequency upsampling result;
Step S32, according to the (i-1)th first weight, performing a weighted summation of the ith high-frequency upsampling result and the high-frequency information of the (i-1)th image to obtain the (i-1)th high-frequency fusion result.
In one embodiment, since the resolution of the ith image is smaller than that of the (i-1)th image, the resolution of the ith image's high-frequency information is also smaller than that of the (i-1)th image's. In order to perform a weighted summation of the two, the resolution of the ith image's high-frequency information must first be increased: the ith high-frequency upsampling result is obtained by upsampling it, and its resolution is the same as that of the (i-1)th image's high-frequency information.
For example, the high-frequency information d4_diff of the 4th image is upsampled to obtain the 4th high-frequency upsampling result d4_up_diff, whose resolution is the same as that of the high-frequency information d3_diff of the 3rd image.
Further, according to the (i-1)th first weight, the ith high-frequency upsampling result and the high-frequency information of the (i-1)th image are weighted and summed to obtain the (i-1)th high-frequency fusion result. The (i-1)th first weight may be set as needed; for example, when i is 4, with the 3rd first weight w3, the 3rd high-frequency fusion result d3_diff_merge is calculated as:
d3_diff_merge = d3_diff × w3 + d4_up_diff × (1 - w3);
Because the ith image is obtained by downsampling the (i-1)th image and its resolution is smaller, the high-frequency information of the (i-1)th image can express more high-frequency content than that of the ith image; in this situation, setting the (i-1)th first weight adjusts the trade-off between denoising and detail retention during fusion. In general, the first weight is a preset value; w(i-1) can be determined from the signal-to-noise ratio of the original image, and a higher signal-to-noise ratio allows a larger w(i-1). Typically, w(n-1), w(n-2), ..., w1 can be set to decrease in sequence.
For example, if the (i-1)th first weight is set larger, more of the (i-1)th image's high-frequency information is blended in, so more of the higher-frequency detail is retained, that is, detail retention during fusion is higher; correspondingly, if it is set smaller, more of the ith high-frequency upsampling result is blended in, so less higher-frequency detail is retained and the denoising effect is relatively stronger.
It should be noted that an (i-1)th high-frequency fusion result may be calculated with a single (i-1)th first weight; for example, in d3_diff_merge = d3_diff × w3 + d4_up_diff × (1 - w3), w3 is a fixed value applied to every pixel of d3_diff and d4_up_diff. However, the (i-1)th first weight may also be made variable as needed, with different values assigned to pixels at different positions, so that in the same formula w3 may differ for each pixel of d3_diff and d4_up_diff depending on the pixel's position.
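Steps S31 and S32 can be sketched as follows, assuming nearest-neighbor upsampling (one of the options listed later) and a single scalar first weight w3; the helper names upsample2x_nearest and fuse_high_freq are invented for illustration.

```python
def upsample2x_nearest(img):
    """Nearest-neighbor 2x upsampling: each source pixel becomes a 2x2 block."""
    return [[img[r // 2][c // 2] for c in range(2 * len(img[0]))]
            for r in range(2 * len(img))]

def fuse_high_freq(d3_diff, d4_diff, w3):
    """Steps S31/S32 for the i = 4 example: upsample d4's high-frequency
    information to d3's resolution, then compute per pixel
    d3_diff_merge = d3_diff*w3 + d4_up_diff*(1 - w3)."""
    d4_up_diff = upsample2x_nearest(d4_diff)          # step S31
    h, w = len(d3_diff), len(d3_diff[0])
    return [[d3_diff[r][c] * w3 + d4_up_diff[r][c] * (1.0 - w3)  # step S32
             for c in range(w)]
            for r in range(h)]
```

With a per-pixel first weight, w3 would simply become an image-sized array indexed like the two operands.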
On the basis of the embodiment shown in FIG. 1 or FIG. 2, FIG. 3 is a schematic flow chart of fusing the filtering result of the ith image with that of the (i-1)th image according to the (i-1)th high-frequency fusion result to obtain the (i-1)th filtering fusion result. As shown in FIG. 3, this fusion includes:
Step S41, upsampling the filtering result of the ith image to obtain the ith filtering upsampling result;
Step S42, determining the (i-1)th second weight corresponding to the (i-1)th high-frequency fusion result, wherein the (i-1)th high-frequency fusion result is positively correlated with the (i-1)th second weight;
Step S43, weighting the filtering result of the (i-1)th image by the (i-1)th second weight, weighting the ith filtering upsampling result by the difference between 1 and the (i-1)th second weight, and obtaining the (i-1)th filtering fusion result from the weighted sum.
In one embodiment, since the resolution of the ith image is smaller than that of the (i-1)th image, the resolution of the ith image's filtering result is also smaller than that of the (i-1)th image's. In order to perform a weighted summation of the two, the resolution of the ith image's filtering result must first be increased: the ith filtering upsampling result is obtained by upsampling it, and its resolution is the same as that of the (i-1)th image's filtering result. For example, the filtering result d4_filter of the 4th image is upsampled to obtain the 4th filtering upsampling result d4_up_filter, whose resolution is the same as that of the filtering result d3_filter of the 3rd image.
Further, the (i-1)th second weight corresponding to the (i-1)th high-frequency fusion result can be determined. The (i-1)th high-frequency fusion result is inversely correlated with the weight of the filtering result of the ith image and positively correlated with the weight of the filtering result of the (i-1)th image, so that for a pixel with a larger high-frequency fusion result, the weight on the ith image's filtering result is smaller and the weight on the (i-1)th image's filtering result is larger.
For example, in the case where n is 4, the 3rd second weight d3_weight may be determined from the 3rd high-frequency fusion result d3_diff_merge; the determined d3_weight differs for pixels with different values of d3_diff_merge. Then, for the pixel (i, j) in the 3rd image d3, with its corresponding 4th filtering upsampling result d4_up_filter(i, j) and 3rd high-frequency fusion result d3_diff_merge(i, j), the determined 3rd second weight is d3_weight(i, j), and the 3rd filtering fusion result d3_filter_new(i, j) of the pixel (i, j) is equal to:
d3_filter(i, j) × d3_weight(i, j) + d4_up_filter(i, j) × (1 - d3_weight(i, j))
Based on the embodiment of FIG. 1, since the 4th image is obtained by downsampling the 3rd image and its resolution is smaller, part of the high-frequency information is lost and a better noise-reduction effect is achieved; the filtering result d4_filter of the 4th image thus contains less high-frequency information than the filtering result d3_filter of the 3rd image. A pixel with a larger high-frequency fusion result is more likely to correspond to an edge in the filtered image, so more high-frequency information can be retained there, allowing edges to be correctly extracted.
On this basis, the filtering result of the 4th image and that of the 3rd image can be weighted and summed according to the 3rd high-frequency fusion result; for example, the 3rd high-frequency fusion result is set to be inversely related to the weight of the filtering result of the 4th image and positively related to the weight of the filtering result of the 3rd image, so that the 3rd filtering fusion result retains more detail in high-frequency regions while achieving stronger denoising in low-frequency regions.
When i is less than 4, the ith high-frequency fusion result is used as the high-frequency information of the ith image and the ith filtering fusion result as the filtering result of the ith image. Thus, while steps S3 and S4 are executed on the 2nd and 3rd images, the high-frequency information of the 3rd image and that of the 2nd image can be weighted and summed to obtain the 2nd high-frequency fusion result, from which the 2nd second weight d2_weight is determined; then the high-frequency information of the 2nd image and that of the 1st image are weighted and summed to obtain the 1st high-frequency fusion result, from which the 1st second weight d1_weight is determined. In addition, the filtering result of the 3rd image and that of the 2nd image may be weighted and summed based on the 2nd second weight d2_weight to obtain the 2nd filtering fusion result, and then the filtering result of the 2nd image and that of the 1st image are weighted and summed based on the 1st second weight d1_weight to finally obtain the 1st filtering fusion result. Compared with the original image, the 1st filtering fusion result retains more detail in high-frequency regions while achieving stronger denoising in low-frequency regions.
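The per-pixel weighted summation of step S43 can be sketched as follows, matching the d3_filter_new formula above: the second weight multiplies the finer level's filtering result and (1 - weight) multiplies the upsampled coarser result. The mapping from fusion result to weight is passed in as a function; the linear ramp shown is a made-up monotone mapping for illustration, not the patent's association table.

```python
def fuse_filter(d3_filter, d4_up_filter, d3_diff_merge, weight_fn):
    """Step S43 sketch: per pixel,
    d3_filter_new = d3_filter*w + d4_up_filter*(1 - w),
    where w = weight_fn(high-frequency fusion result) is the second weight,
    positively correlated with the fusion result."""
    h, w = len(d3_filter), len(d3_filter[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            wt = weight_fn(d3_diff_merge[r][c])      # step S42
            out[r][c] = (d3_filter[r][c] * wt
                         + d4_up_filter[r][c] * (1.0 - wt))
    return out

def linear_ramp_weight(hf, lo=0.0, hi=8.0):
    """Illustrative monotone mapping: clamp the fusion result into [lo, hi]
    and scale it to [0, 1]; larger high-frequency -> larger second weight."""
    return min(max((hf - lo) / (hi - lo), 0.0), 1.0)
```

With this weighting, edge pixels (large fusion result) keep the finer filtering result, while smooth pixels take the more strongly denoised upsampled result.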
Optionally, the (i-1)th high-frequency fusion result is inversely related to the weight of the filtering result of the ith image, and positively related to the weight of the filtering result of the (i-1)th image.
It should be noted that, in the embodiments of the present disclosure, positive correlation means that when A and B are positively correlated, A increases as B increases in the overall trend, while in a local interval A may remain unchanged as B increases; negative (inverse) correlation means that when A and B are inversely correlated, A decreases as B increases in the overall trend, while in a local interval A may remain unchanged as B increases.
Based on the embodiment shown in FIG. 3, FIG. 4 is a schematic flowchart of determining the (i-1)th second weight corresponding to the (i-1)th high-frequency fusion result. As shown in FIG. 4, determining the (i-1)th second weight corresponding to the (i-1)th high-frequency fusion result includes:
Step S421: determining the (i-1)th second weight corresponding to the (i-1)th high-frequency fusion result according to an association table of (i-1)th second weights and (i-1)th high-frequency fusion results.
In one embodiment, the (i-1)th second weights and the (i-1)th high-frequency fusion results may be stored in an association table, in which different (i-1)th high-frequency fusion results correspond to different (i-1)th second weights, so that the (i-1)th second weight corresponding to a given (i-1)th high-frequency fusion result can subsequently be looked up in the table.
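As a hypothetical sketch of such an association table (the thresholds and weight values below are invented for illustration; the disclosure does not specify them), a monotonic lookup can be implemented with `bisect`:

```python
import bisect

# Hypothetical association table: high-frequency fusion value thresholds
# mapped to second weights. The weights are non-decreasing, i.e.
# positively correlated with the high-frequency fusion result.
HF_THRESHOLDS = [4.0, 8.0, 16.0, 32.0]
SECOND_WEIGHTS = [0.1, 0.3, 0.5, 0.8, 1.0]  # one more entry than thresholds

def lookup_second_weight(hf_value):
    """Return the second weight for a given high-frequency fusion value."""
    return SECOND_WEIGHTS[bisect.bisect_right(HF_THRESHOLDS, hf_value)]
```

Because the table is monotonic, larger high-frequency fusion values always map to equal or larger second weights.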
Optionally, the manner of downsampling includes at least one of:
Gaussian downsampling, mean downsampling, maximum or minimum downsampling, and median downsampling.
In one embodiment, the manner of downsampling may be selected as desired, and the manner used for each downsampling operation may be the same or different.
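A minimal sketch of block-statistic 2x downsampling (mean, maximum, minimum, or median over each 2x2 block) might look as follows; Gaussian downsampling would additionally low-pass filter before decimating. The function name and fixed 2x2 block size are assumptions for this sketch:

```python
import statistics

def downsample_2x(img, mode="mean"):
    """Downsample a 2-D list of pixel values by 2 in each dimension,
    reducing every 2x2 block with the chosen statistic."""
    reducers = {
        "mean": statistics.mean,
        "max": max,
        "min": min,
        "median": statistics.median,
    }
    reduce_fn = reducers[mode]
    h, w = len(img), len(img[0])
    return [
        [reduce_fn([img[y][x], img[y][x + 1],
                    img[y + 1][x], img[y + 1][x + 1]])
         for x in range(0, w - 1, 2)]
        for y in range(0, h - 1, 2)
    ]
```

Different levels of the pyramid could call this with different `mode` values, consistent with the statement that each downsampling operation may differ.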
Optionally, the manner of upsampling includes at least one of:
nearest-neighbor interpolation, bilinear interpolation, and cubic interpolation.
In one embodiment, the manner of upsampling employed may be selected as desired, and the manner of each upsampling operation may be the same or different.
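As an illustrative sketch of the simplest of these modes, nearest-neighbor 2x upsampling merely repeats each pixel along both axes (bilinear and cubic upsampling would interpolate between neighboring pixels instead); the function name and the fixed 2x factor are assumptions:

```python
def upsample_nearest_2x(img):
    """Nearest-neighbor 2x upsampling: repeat each pixel in both axes."""
    out = []
    for row in img:
        # Duplicate each pixel horizontally, then the whole row vertically.
        wide = [v for v in row for _ in (0, 1)]
        out.append(wide)
        out.append(list(wide))
    return out
```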
Optionally, the edge-preserving filtering mode includes at least one of:
bilateral filtering, guided filtering, and weighted least squares filtering.
In one embodiment, the manner of edge preserving filtering employed may be selected as desired.
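To illustrate the idea of edge-preserving filtering, the following is a one-dimensional bilateral filter sketch (the parameter names and default values are assumptions): each sample is averaged with its neighbors, weighted by both spatial distance and intensity difference, so strong edges survive while flat regions are smoothed. The high-frequency information can then be taken as the difference between the input and the filtered output.

```python
import math

def bilateral_filter_1d(signal, radius=2, sigma_s=1.0, sigma_r=10.0):
    """1-D bilateral filter: each sample becomes a weighted mean of its
    neighbors, where the weight falls off with spatial distance
    (sigma_s) and with intensity difference (sigma_r)."""
    out = []
    n = len(signal)
    for i in range(n):
        total, norm = 0.0, 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            w = math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2)
                         - ((signal[i] - signal[j]) ** 2) / (2 * sigma_r ** 2))
            total += w * signal[j]
            norm += w
        out.append(total / norm)
    return out
```

On a step edge, samples on opposite sides of the step receive near-zero range weights, so the edge is preserved rather than blurred.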
On the basis of any of the above embodiments, FIG. 5 is a schematic flowchart illustrating another image processing method according to an embodiment of the present disclosure. As shown in FIG. 5, before downsampling the original image n-1 times to determine the images of n resolutions, the method further includes:
Step S5: determining the value of n according to the resolution of the original image.
In an embodiment, the value of n may be determined based on the resolution of the original image, which in turn determines the number of downsampling operations n-1. For example, the higher the resolution of the original image, the larger n may be, that is, the more downsampling operations n-1 may be performed. This ensures that low-frequency information of a sufficiently low frequency is obtained, so that the fused result has a better denoising effect.
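One hypothetical rule for choosing n (the `min_side` and `max_levels` values below are invented for illustration; the disclosure only requires that larger resolutions yield larger n) is to keep halving until the shorter image side would become too small:

```python
def pyramid_depth(width, height, min_side=64, max_levels=6):
    """Choose the number of pyramid levels n: halve the shorter side
    until it would drop below min_side, capped at max_levels, so a
    higher-resolution original yields a deeper pyramid."""
    side = min(width, height)
    n = 1
    while side >= 2 * min_side and n < max_levels:
        side //= 2
        n += 1
    return n
```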
Optionally, the high-frequency information includes, for each pixel in the corresponding image, the mean of the absolute values of the pixel value differences between that pixel and the pixels in its neighborhood.
In one embodiment, for each pixel in the image, the mean of the absolute values of the pixel value differences between the pixel and the pixels in its neighborhood may be calculated; this mean expresses how much the pixel's value differs from those in its neighborhood.
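A direct sketch of this neighborhood statistic in pure Python follows; excluding the pixel itself from its neighborhood is an assumption, since the disclosure does not state whether it is included:

```python
def high_frequency_map(img, radius=1):
    """For every pixel, compute the mean of |p - q| over the pixels q
    in its (2*radius+1)^2 neighborhood (the pixel itself excluded).
    Large values flag edges and texture; small values flag flat areas."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            diffs = []
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if (dy or dx) and 0 <= ny < h and 0 <= nx < w:
                        diffs.append(abs(img[y][x] - img[ny][nx]))
            out[y][x] = sum(diffs) / len(diffs)
    return out
```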
The present disclosure also proposes an embodiment of an image processing apparatus, corresponding to the embodiment of the image processing method described above.
An embodiment of the present disclosure provides an image processing apparatus, including a processor configured to,
downsampling an original image n-1 times to determine images of n resolutions, wherein the resolution of the ith image is lower than that of the (i-1)th image, i is greater than 1 and less than or equal to n, and the 1st image is the original image;
performing edge-preserving filtering on each image to obtain high-frequency information and a filtering result; and
performing the following steps on the 2nd to nth images, respectively, until a 1st filtering fusion result is obtained:
fusing the high-frequency information of the ith image and the high-frequency information of the (i-1)th image to obtain an (i-1)th high-frequency fusion result, wherein when i is less than n, the ith high-frequency fusion result is used as the high-frequency information of the ith image; and
fusing, according to the (i-1)th high-frequency fusion result, the filtering result of the ith image and the filtering result of the (i-1)th image to obtain an (i-1)th filtering fusion result, wherein when i is less than n, the ith filtering fusion result is used as the filtering result of the ith image.
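The coarse-to-fine loop the processor performs can be sketched abstractly as follows; the concrete `upsample`, `fuse_hf`, and `fuse_filt` operations are passed in as callables, since the disclosure leaves their exact form open:

```python
def fuse_pyramid(hf, filt, upsample, fuse_hf, fuse_filt):
    """Coarse-to-fine fusion over a pyramid of n levels.

    hf, filt:  lists of per-level high-frequency information and
               filtering results; index 0 = full resolution (1st image),
               index n-1 = coarsest (nth image).
    upsample(x):        bring a level-i result to level i-1 resolution.
    fuse_hf(a, b):      combine upsampled level-i HF `a` with level-(i-1) HF `b`.
    fuse_filt(a, b, h): blend upsampled level-i filtering result `a` with
                        level-(i-1) result `b`, guided by the fused HF `h`.
    Returns the 1st filtering fusion result (full resolution).
    """
    n = len(filt)
    for i in range(n - 1, 0, -1):  # i = n .. 2 in the patent's 1-based terms
        hf_fused = fuse_hf(upsample(hf[i]), hf[i - 1])
        filt[i - 1] = fuse_filt(upsample(filt[i]), filt[i - 1], hf_fused)
        hf[i - 1] = hf_fused       # reused as HF info of the finer level
    return filt[0]
```

For example, with identity upsampling and simple blending callables, each level is reduced to a single scalar purely to show the control flow; a real implementation would operate on 2-D arrays.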
In one embodiment, the processor is configured to,
up-sampling the high-frequency information of the ith image to obtain an ith high-frequency up-sampling result;
performing weighted summation on the ith high-frequency upsampling result and the high-frequency information of the (i-1)th image according to an (i-1)th first weight, to obtain the (i-1)th high-frequency fusion result.
In one embodiment, the processor is configured to,
up-sampling the filtering result of the ith image to obtain an ith filtering up-sampling result;
determining an (i-1)th second weight corresponding to the (i-1)th high-frequency fusion result, wherein the (i-1)th high-frequency fusion result is positively correlated with the (i-1)th second weight;
weighting the ith filtering upsampling result according to the (i-1)th second weight, weighting the filtering result of the (i-1)th image according to the difference between 1 and the (i-1)th second weight, and obtaining the (i-1)th filtering fusion result from the result of the weighted summation.
In one embodiment, the (i-1) th high-frequency fusion result is inversely related to the weight of the filtering result of the (i) th image and positively related to the weight of the filtering result of the (i-1) th image.
In one embodiment, the processor is configured to,
determining the (i-1)th second weight corresponding to the (i-1)th high-frequency fusion result according to an association table of (i-1)th second weights and (i-1)th high-frequency fusion results.
In one embodiment, the down-sampling comprises at least one of:
Gaussian downsampling, mean downsampling, maximum or minimum downsampling, and median downsampling.
In one embodiment, the manner of upsampling includes at least one of:
nearest neighbor, bilinear, cubic.
In one embodiment, the manner of edge-preserving filtering includes at least one of:
bilateral filtering, guided filtering, weighted least square filtering.
In one embodiment, the processor is further configured to determine a value of n based on a resolution of the original image.
In one embodiment, the high-frequency information comprises, for each pixel in the corresponding image, the mean of the absolute values of the pixel value differences between that pixel and the pixels in its neighborhood.
The embodiment of the disclosure provides an unmanned aerial vehicle, which comprises the image processing device in any one of the embodiments.
The systems, devices, modules, or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. For convenience of description, the above devices are described as being divided into various units by function. Of course, when implementing the present application, the functionality of the units may be implemented in one or more pieces of software and/or hardware. As will be appreciated by those skilled in the art, embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) having computer-usable program code embodied therein.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, since the system embodiment is substantially similar to the method embodiment, its description is brief; for relevant details, reference may be made to the corresponding description of the method embodiment.
It is noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between such entities or actions. The terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall fall within the scope of the claims of the present application.

Claims (21)

1. An image processing method, comprising:
downsampling an original image n-1 times to determine images of n resolutions, wherein the resolution of the ith image is lower than that of the (i-1)th image, i is greater than 1 and less than or equal to n, and the 1st image is the original image;
performing edge-preserving filtering on each image to obtain high-frequency information and a filtering result; and
performing the following steps on the 2nd to nth images, respectively, until a 1st filtering fusion result is obtained:
fusing the high-frequency information of the ith image and the high-frequency information of the (i-1)th image to obtain an (i-1)th high-frequency fusion result, wherein when i is less than n, the ith high-frequency fusion result is used as the high-frequency information of the ith image; and
fusing, according to the (i-1)th high-frequency fusion result, the filtering result of the ith image and the filtering result of the (i-1)th image to obtain an (i-1)th filtering fusion result, wherein when i is less than n, the ith filtering fusion result is used as the filtering result of the ith image.
2. The method according to claim 1, wherein the fusing the high frequency information of the ith image and the high frequency information of the (i-1) th image to obtain an (i-1) th high frequency fusion result comprises:
up-sampling the high-frequency information of the ith image to obtain an ith high-frequency up-sampling result;
performing weighted summation on the ith high-frequency upsampling result and the high-frequency information of the (i-1)th image according to an (i-1)th first weight, to obtain the (i-1)th high-frequency fusion result.
3. The method according to claim 1, wherein the fusing the filtering result of the ith image and the filtering result of the ith-1 image according to the ith-1 high-frequency fusion result to obtain the ith-1 filtering fusion result comprises:
up-sampling the filtering result of the ith image to obtain an ith filtering up-sampling result;
determining an (i-1)th second weight corresponding to the (i-1)th high-frequency fusion result, wherein the (i-1)th high-frequency fusion result is positively correlated with the (i-1)th second weight;
weighting the ith filtering upsampling result according to the (i-1)th second weight, weighting the filtering result of the (i-1)th image according to the difference between 1 and the (i-1)th second weight, and obtaining the (i-1)th filtering fusion result from the result of the weighted summation.
4. The method of claim 3, wherein the (i-1) th high frequency fusion result is inversely related to the weight of the filtering result of the (i) th image and positively related to the weight of the filtering result of the (i-1) th image.
5. The method according to claim 3, wherein the determining the i-1 th second weight corresponding to the i-1 th high frequency fusion result comprises:
determining the (i-1)th second weight corresponding to the (i-1)th high-frequency fusion result according to an association table of (i-1)th second weights and (i-1)th high-frequency fusion results.
6. The method of any of claims 1 to 5, wherein the down-sampling comprises at least one of:
Gaussian downsampling, mean downsampling, maximum or minimum downsampling, and median downsampling.
7. The method of any one of claims 1 to 5, wherein the manner of upsampling comprises at least one of:
nearest neighbor, bilinear, cubic.
8. The method according to any one of claims 1 to 5, wherein the edge-preserving filtering comprises at least one of:
bilateral filtering, guided filtering, weighted least square filtering.
9. The method of any of claims 1 to 5, wherein before down-sampling the original image n-1 times to determine the n resolution images, the method further comprises:
and determining the value of n according to the resolution of the original image.
10. The method according to any one of claims 1 to 5, wherein the high-frequency information comprises, for each pixel in the corresponding image, the mean of the absolute values of the pixel value differences between that pixel and the pixels in its neighborhood.
11. An image processing apparatus comprising a processor configured to,
downsampling an original image n-1 times to determine images of n resolutions, wherein the resolution of the ith image is lower than that of the (i-1)th image, i is greater than 1 and less than or equal to n, and the 1st image is the original image;
performing edge-preserving filtering on each image to obtain high-frequency information and a filtering result; and
performing the following steps on the 2nd to nth images, respectively, until a 1st filtering fusion result is obtained:
fusing the high-frequency information of the ith image and the high-frequency information of the (i-1)th image to obtain an (i-1)th high-frequency fusion result, wherein when i is less than n, the ith high-frequency fusion result is used as the high-frequency information of the ith image; and
fusing, according to the (i-1)th high-frequency fusion result, the filtering result of the ith image and the filtering result of the (i-1)th image to obtain an (i-1)th filtering fusion result, wherein when i is less than n, the ith filtering fusion result is used as the filtering result of the ith image.
12. The apparatus of claim 11, wherein the processor is configured to,
up-sampling the high-frequency information of the ith image to obtain an ith high-frequency up-sampling result;
performing weighted summation on the ith high-frequency upsampling result and the high-frequency information of the (i-1)th image according to an (i-1)th first weight, to obtain the (i-1)th high-frequency fusion result.
13. The apparatus of claim 11, wherein the processor is configured to,
up-sampling the filtering result of the ith image to obtain an ith filtering up-sampling result;
determining an i-1 th second weight corresponding to the i-1 th high-frequency fusion result, wherein the i-1 th high-frequency fusion result is positively correlated with the i-1 th second weight;
weighting the ith filtering upsampling result according to the (i-1)th second weight, weighting the filtering result of the (i-1)th image according to the difference between 1 and the (i-1)th second weight, and obtaining the (i-1)th filtering fusion result from the result of the weighted summation.
14. The apparatus of claim 13, wherein the (i-1) th high frequency fusion result is inversely related to the weight of the filtering result of the (i) th image and positively related to the weight of the filtering result of the (i-1) th image.
15. The apparatus of claim 14, wherein the processor is configured to,
and determining the (i-1) th second weight corresponding to the (i-1) th high-frequency fusion result according to the association relation table of the (i-1) th second weight and the (i-1) th high-frequency fusion result.
16. The apparatus of any of claims 11 to 15, wherein the down-sampling comprises at least one of:
Gaussian downsampling, mean downsampling, maximum or minimum downsampling, and median downsampling.
17. The apparatus of any of claims 11 to 15, wherein the means for upsampling comprises at least one of:
nearest neighbor, bilinear, cubic.
18. The apparatus according to any one of claims 11 to 15, wherein the edge-preserving filtering manner comprises at least one of:
bilateral filtering, guided filtering, weighted least square filtering.
19. The apparatus according to any of claims 11 to 15, wherein the processor is further configured to determine the value of n based on a resolution of the original image.
20. The apparatus according to any one of claims 11 to 15, wherein the high-frequency information comprises, for each pixel in the corresponding image, the mean of the absolute values of the pixel value differences between that pixel and the pixels in its neighborhood.
21. A drone, characterized in that it comprises an image processing device according to any one of claims 11 to 20.
CN201880069922.4A 2018-12-18 2018-12-18 Image processing method, image processing device and unmanned aerial vehicle Pending CN111344736A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/121726 WO2020124355A1 (en) 2018-12-18 2018-12-18 Image processing method, image processing device, and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN111344736A true CN111344736A (en) 2020-06-26

Family

ID=71100109

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880069922.4A Pending CN111344736A (en) 2018-12-18 2018-12-18 Image processing method, image processing device and unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN111344736A (en)
WO (1) WO2020124355A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113296664A (en) * 2021-05-18 2021-08-24 Oppo广东移动通信有限公司 Screen resolution adjusting method and device, terminal equipment and storage medium

Citations (6)

Publication number Priority date Publication date Assignee Title
US20080253678A1 (en) * 2007-04-10 2008-10-16 Arcsoft, Inc. Denoise method on image pyramid
CN103778606A (en) * 2014-01-17 2014-05-07 Tcl集团股份有限公司 Image processing method and related devices
US20150093015A1 (en) * 2013-09-26 2015-04-02 Hong Kong Applied Science & Technology Research Institute Company Limited Visual-Experience-Optimized Super-Resolution Frame Generator
CN106204657A (en) * 2016-07-21 2016-12-07 北京邮电大学 Moving target based on gaussian pyramid and wavelet transformation describes method across yardstick
CN108154474A (en) * 2017-12-22 2018-06-12 浙江大华技术股份有限公司 A kind of super-resolution image reconstruction method, device, medium and equipment
CN108205804A (en) * 2016-12-16 2018-06-26 阿里巴巴集团控股有限公司 Image processing method, device and electronic equipment

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US8120679B2 (en) * 2008-08-01 2012-02-21 Nikon Corporation Image processing method
CN104008539B (en) * 2014-05-29 2017-02-15 西安理工大学 Image super-resolution rebuilding method based on multiscale geometric analysis
CN106530244B (en) * 2016-10-26 2019-03-19 长安大学 A kind of image enchancing method
CN106611408A (en) * 2016-10-26 2017-05-03 成都易云知科技有限公司 Image fusion method
CN107316274A (en) * 2017-05-10 2017-11-03 重庆邮电大学 A kind of Infrared image reconstruction method that edge is kept



Also Published As

Publication number Publication date
WO2020124355A1 (en) 2020-06-25

Similar Documents

Publication Publication Date Title
KR101743861B1 (en) Methods of image fusion for image stabilization
CN107492066B (en) Image processing apparatus and method for performing preprocessing to obtain an image with improved sharpness
EP3620989A1 (en) Information processing method, information processing apparatus, and program
CN108629744B (en) Image enhancement method
US20060139376A1 (en) Content adaptive resizer
CN105631828A (en) Image processing method and device
CN113344821B (en) Image noise reduction method, device, terminal and storage medium
EP3438923B1 (en) Image processing apparatus and image processing method
WO2022016326A1 (en) Image processing method, electronic device, and computer-readable medium
CN108234826B (en) Image processing method and device
CN111031241B (en) Image processing method and device, terminal and computer readable storage medium
CN111833269A (en) Video noise reduction method and device, electronic equipment and computer readable medium
CN113344820B (en) Image processing method and device, computer readable medium and electronic equipment
JP2003509779A (en) Compressed edge adaptive video and image sharpening and scaling method and system
JP2020031422A (en) Image processing method and device
CN111344736A (en) Image processing method, image processing device and unmanned aerial vehicle
KR101615479B1 (en) Method and apparatus for processing super resolution image using adaptive pre/post-filtering
van Zyl Marais et al. Robust defocus blur identification in the context of blind image quality assessment
CN111311498B (en) Image ghost eliminating method and device, storage medium and terminal
WO2021102704A1 (en) Image processing method and apparatus
JP2019160297A (en) Image processing device for reducing stepwise artifact from image signal
CN114596210A (en) Noise estimation method, device, terminal equipment and computer readable storage medium
CN113469889A (en) Image noise reduction method and device
CN111986095A (en) Image processing method and image processing device based on edge extraction
KR101650897B1 (en) Window size zooming method and the apparatus for lower resolution contents

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200626