CN113888438A - Image processing method, device and storage medium - Google Patents


Info

Publication number
CN113888438A
Authority
CN
China
Prior art keywords
image
detail
infrared image
noise reduction
gray value
Prior art date
Legal status
Pending
Application number
CN202111204143.0A
Other languages
Chinese (zh)
Inventor
张�浩
潘永友
崔明玉
邵晓力
Current Assignee
Hangzhou Micro Image Software Co ltd
Original Assignee
Hangzhou Micro Image Software Co ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Micro Image Software Co ltd
Priority to CN202111204143.0A
Publication of CN113888438A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G06T5/40 Image enhancement or restoration using histogram techniques
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/70 Denoising; Smoothing
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G06T2207/20032 Median filtering
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The embodiments of the application disclose an image processing method, an image processing device, and a storage medium, belonging to the field of image processing. In these embodiments, the detail information in a first infrared image is enhanced to obtain a detail-enhanced image, making the details of low-temperature-difference regions more prominent. In addition, the reference noise-reduction strength of the target noise-reduction mode is adjusted according to the ISP gain of the first infrared image, so that the noise-reduction strength adapts better to the scene. On this basis, the noise-reduction and detail-enhancement processes are performed separately on the first infrared image, and the detail-enhanced image and the noise-reduced image are then fused, which reduces the detail loss that noise reduction causes and improves the visualization effect of the infrared image.

Description

Image processing method, device and storage medium
Technical Field
The present application relates to the field of image processing, and in particular, to an image processing method, an image processing apparatus, and a storage medium.
Background
An infrared thermal imaging device is one of the main image acquisition devices. It receives the thermal radiation emitted from the surface of a detected object through an infrared detector and generates an infrared image containing that object. When the detected object is in a low-temperature-difference scene, the generated infrared image suffers from a poor visualization effect. It is therefore desirable to provide an image processing method that improves the visualization effect of infrared images.
Disclosure of Invention
The embodiments of the application provide an image processing method, an image processing device, and a storage medium, which can alleviate the poor visualization effect of infrared images in low-temperature-difference scenes. The technical solution is as follows:
in one aspect, an image processing method is provided, and the method includes:
enhancing the detail information of the first infrared image to be processed to obtain a detail enhanced image;
performing noise reduction processing on the first infrared image according to an image signal processing (ISP) gain of the first infrared image to obtain a noise-reduced image;
and fusing the detail enhanced image and the noise-reduced image to obtain a second infrared image.
Optionally, the enhancing the detail information of the first infrared image to be processed to obtain a detail-enhanced image includes:
determining a plurality of detail pixel points which meet preset conditions in the first infrared image;
acquiring a third infrared image with highlighted details according to the plurality of detail pixel points;
and extracting and enhancing the detail information in the third infrared image to obtain the detail enhanced image.
Optionally, the preset condition is that the gradient value of the pixel point is greater than a first reference threshold, or the gray value of the pixel point is greater than a second reference threshold.
Optionally, the obtaining a third infrared image with highlighted details according to the plurality of detail pixel points includes: and increasing the gray difference between the non-detail pixel points adjacent to the plurality of detail pixel points in the first infrared image and the plurality of detail pixel points to obtain the third infrared image.
Optionally, the increasing gray scale difference between non-detail pixel points adjacent to the detail pixel points in the first infrared image and the detail pixel points to obtain the third infrared image includes:
acquiring a first histogram of the first infrared image;
according to each detail pixel point, increasing the number of the pixel points corresponding to the gray value of the corresponding detail pixel point in the first histogram by a specified value to obtain a second histogram;
according to the sequence of gray values from small to large, the number of pixels corresponding to each gray value in the second histogram is updated to the sum of the number of pixels corresponding to the corresponding gray value and the number of pixels corresponding to the previous gray value of the corresponding gray value, and a third histogram is obtained;
determining a mapping gray value corresponding to each gray value in the first histogram according to the third histogram and the reference gray number;
and replacing the gray value of each pixel point in the first infrared image with the mapping gray value corresponding to the corresponding gray value in the first histogram to obtain the third infrared image.
Optionally, the performing, according to the ISP gain of the first infrared image, noise reduction processing on the first infrared image includes:
acquiring reference noise reduction strength corresponding to a target noise reduction mode;
taking the product of the ISP gain and the reference noise reduction strength as a target noise reduction strength;
and performing noise reduction processing on the first infrared image by adopting the target noise reduction mode according to the target noise reduction strength.
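The three steps above amount to scaling a fixed per-mode reference strength by the ISP gain and then denoising at the resulting target strength. A minimal NumPy sketch, in which the function name, the clipping of the strength to [0, 1], and the 3x3 box filter standing in for the unspecified target noise-reduction mode are all assumptions:

```python
import numpy as np

def denoise_with_gain(img, isp_gain, reference_strength):
    """Scale the reference strength of the noise-reduction mode by the
    ISP gain, then blend the image with a smoothed copy at that strength."""
    # Target strength = ISP gain x reference strength, clipped to [0, 1]
    target_strength = float(np.clip(isp_gain * reference_strength, 0.0, 1.0))
    h, w = img.shape
    padded = np.pad(img.astype(np.float64), 1, mode="edge")
    # 3x3 box filter as a stand-in for the unspecified noise-reduction mode
    smoothed = sum(padded[dy:dy + h, dx:dx + w]
                   for dy in range(3) for dx in range(3)) / 9.0
    return (1.0 - target_strength) * img + target_strength * smoothed
```

A higher ISP gain (a noisier frame) thus automatically pulls the output toward the smoothed copy, which is the scene adaptivity the abstract refers to.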
Optionally, the fusing the detail-enhanced image and the noise-reduced image to obtain a second infrared image includes:
and fusing the gray value of each pixel point in the detail enhancement image with the gray value of the pixel point at the corresponding position in the noise reduction image to obtain the second infrared image.
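A minimal NumPy sketch of this per-pixel fusion; the weighted average and the default equal weighting are assumptions, since the text only requires that the two gray values at each position be combined:

```python
import numpy as np

def fuse(detail_img, denoised_img, detail_weight=0.5):
    """Combine, at each position, the gray value of the detail-enhanced
    image with that of the noise-reduced image (weighted average here)."""
    fused = detail_weight * detail_img + (1.0 - detail_weight) * denoised_img
    # Clamp to an 8-bit output range (an assumption about the target format)
    return np.clip(np.rint(fused), 0, 255).astype(np.uint8)
```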
In another aspect, there is provided an image processing apparatus, the apparatus including:
the detail enhancement module is used for enhancing the detail information of the first infrared image to be processed to obtain a detail enhanced image;
the noise reduction module is used for performing noise reduction processing on the first infrared image according to the image signal processing (ISP) gain of the first infrared image to obtain a noise-reduced image;
and the fusion module is used for fusing the detail enhancement image and the noise reduction image to obtain a second infrared image.
Optionally, the detail enhancement module is mainly configured to:
determining a plurality of detail pixel points which meet preset conditions in the first infrared image;
acquiring a third infrared image with highlighted details according to the plurality of detail pixel points;
and extracting and enhancing the detail information in the third infrared image to obtain the detail enhanced image.
Optionally, the preset condition is that the gradient value of the pixel point is greater than a first reference threshold, or the gray value of the pixel point is greater than a second reference threshold.
Optionally, the detail enhancement module is mainly configured to: and increasing the gray difference between the non-detail pixel points adjacent to the plurality of detail pixel points in the first infrared image and the plurality of detail pixel points to obtain the third infrared image.
Optionally, the detail enhancement module is mainly configured to:
acquiring a first histogram of the first infrared image;
according to each detail pixel point, increasing the number of the pixel points corresponding to the gray value of the corresponding detail pixel point in the first histogram by a specified value to obtain a second histogram;
according to the sequence of gray values from small to large, the number of pixels corresponding to each gray value in the second histogram is updated to the sum of the number of pixels corresponding to the corresponding gray value and the number of pixels corresponding to the previous gray value of the corresponding gray value, and a third histogram is obtained;
determining a mapping gray value corresponding to each gray value in the first histogram according to the third histogram and the reference gray number;
and replacing the gray value of each pixel point in the first infrared image with the mapping gray value corresponding to the corresponding gray value in the first histogram to obtain the third infrared image.
Optionally, the noise reduction module is mainly configured to:
acquiring reference noise reduction strength corresponding to a target noise reduction mode;
taking the product of the ISP gain and the reference noise reduction strength as a target noise reduction strength;
and performing noise reduction processing on the first infrared image by adopting the target noise reduction mode according to the target noise reduction strength.
Optionally, the fusion module is mainly configured to:
and fusing the gray value of each pixel point in the detail enhancement image with the gray value of the pixel point at the corresponding position in the noise reduction image to obtain the second infrared image.
In another aspect, there is provided an image processing apparatus, the apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor executes the executable instructions in the memory to perform the image processing method described above.
In another aspect, a computer-readable storage medium is provided, in which a computer program is stored, which, when executed by a computer, implements the steps of the image processing method described above.
In another aspect, a computer program product is provided comprising instructions which, when run on a computer, cause the computer to perform the steps of the image processing method described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
in the embodiments of the application, noise reduction is performed on the first infrared image according to its ISP gain, so that more detail information is retained in the processed image. On this basis, the noise-reduction and detail-enhancement processes are performed separately and the resulting images are fused; compared with enhancing details after noise reduction, or reducing noise after detail enhancement, this reduces the detail loss that noise reduction causes and improves the visualization effect of the infrared image.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a system architecture diagram for an image processing method provided in an embodiment of the present application;
fig. 2 is a flowchart of an image processing method provided in an embodiment of the present application;
fig. 3 is a flowchart for enhancing detail information of a first infrared image to be processed to obtain a detail enhanced image according to an embodiment of the present application;
fig. 4 is a schematic diagram of determining a detail pixel point through a first infrared image and a gradient map of the first infrared image according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a first histogram provided by an embodiment of the present application;
FIG. 6 is a diagram of a second histogram provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a third histogram provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of a third infrared image provided in accordance with an embodiment of the present application;
fig. 9 is a flowchart of performing noise reduction processing on a first infrared image according to an ISP gain of the first infrared image to obtain a noise-reduced image according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of an image processing server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Before explaining the embodiments of the present application in detail, an application scenario of the embodiments of the present application will be described.
The image processing method provided by the embodiments of the application can be used to process infrared images acquired by an infrared thermal imaging device in low-temperature-difference scenes. For example, in a sea-surface navigation scene, an infrared thermal imaging device mounted on a ship can monitor the safety of the surrounding sea area at night; processing the acquired infrared images of that sea area with the method of the embodiments improves the visualization of the imaged sea area in the infrared image and thus the safety of navigation. As another example, in non-destructive inspection of building quality, an infrared thermal imaging device can acquire infrared images of a target building to check for quality defects such as cracks and hollows; processing these images with the method of the embodiments improves the visualization of the target building in the infrared image and thus the accuracy of the judgment.
It should be noted that the foregoing is only some exemplary application scenarios given in the embodiments of the present application, and does not constitute a limitation on the application scenarios of the image processing method provided in the embodiments of the present application.
Fig. 1 is a system architecture diagram for an image processing method provided in an embodiment of the present application. As shown in fig. 1, the system includes an image processing apparatus 101 and an infrared thermal imaging apparatus 102, which may be connected through a wired network or a wireless network.
The infrared thermal imaging device 102 acquires an infrared image containing a detected target and sends it to the image processing device 101; correspondingly, the image processing device 101 receives the infrared image of the detected target and processes it using the method provided by the embodiments of the application.
The infrared thermal imaging device 102 may be an infrared thermal imager or other image acquisition devices capable of acquiring infrared images, which is not limited in this application. The image processing apparatus 101 may be a computer apparatus having a data processing function, and for example, may be a server or a server cluster, or a cloud platform providing an image processing service. Alternatively, the image processing apparatus 101 may also be a terminal apparatus, for example, a smart phone, a tablet computer, a notebook computer, a personal computer, and the like, which is not limited in this embodiment.
Optionally, in some possible implementations, the image processing device 101 may also be integrated in the infrared thermal imaging device 102, that is, the image processing device 101 may be an image processing unit included in the infrared thermal imaging device 102. The following embodiment describes an implementation process of the embodiment of the present application with the image processing apparatus 101 as a server.
Next, an image processing method provided in an embodiment of the present application will be described.
Fig. 2 is a flowchart of an image processing method according to an embodiment of the present application. The method can be applied to an image processing device; a server is taken as an example in the following description. As shown in fig. 2, the method includes the following steps:
step 201: and enhancing the detail information of the first infrared image to be processed to obtain a detail enhanced image.
The first infrared image may be an original image acquired by the infrared thermal imaging device without compression, for example, an original grayscale image with a 14-bit gray level.
In this embodiment of the application, the server may process the first infrared image in a histogram equalization manner, so that the detail information in the first infrared image can be more prominent. The server may then extract and enhance the detail information from the processed first infrared image, resulting in a detail-enhanced image.
For example, referring to fig. 3, the server may process the detail information of the first infrared image to be processed through the following steps 2011 to 2013.
2011: and determining a plurality of detail pixel points which meet preset conditions in the first infrared image.
The preset condition may be that the gradient value of the pixel is greater than a first reference threshold, or may also be that the gray value of the pixel is greater than a second reference threshold.
When the preset condition is that the gradient value of the pixel point is larger than the first reference threshold value, the server firstly obtains the gradient value of each pixel point in the first infrared image, and then takes the pixel point of which the gradient value is larger than the first reference threshold value in the first infrared image as a detail pixel point.
Here, the gradient value of a pixel point (x, y) in the first infrared image is the sum of the absolute gray difference between pixel point (x, y) and its neighboring pixel point (x + 1, y) and the absolute gray difference between pixel point (x, y) and its neighboring pixel point (x, y + 1).
It should be noted that the gradient value of a pixel point in the last row of the first infrared image is taken to be the gradient value of the pixel point in the same column of the adjacent previous row; similarly, the gradient value of a pixel point in the last column is taken to be the gradient value of the pixel point in the same row of the adjacent previous column.
For example, graph A in fig. 4 shows the gray values of the pixel points in the first infrared image. The gradient value of the top-left pixel point with gray value 1 in graph A is calculated as |1 - 6| + |1 - 2| = 6. Calculating the gradient value of each pixel point of graph A in this way yields the gradient map shown in graph B of fig. 4, in which the value of each point is the gradient value of the pixel at the corresponding position in graph A. If the first reference threshold is 5, the pixel points in graph A corresponding to the points with gradient values 6 and 7 in graph B are the detail pixel points of graph A, such as the pixel points marked by circles in graph A of fig. 4.
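The gradient map described above can be sketched in NumPy as follows; the function name and vectorized formulation are illustrative, and the border handling follows the rule stated for the last row and column:

```python
import numpy as np

def gradient_map(img):
    """Gradient of pixel (x, y): |I(x,y) - I(x+1,y)| + |I(x,y) - I(x,y+1)|.
    The last row / last column reuse the gradient of the adjacent previous
    row / column, as stated in the text."""
    img = img.astype(np.int64)
    g = np.zeros_like(img)
    g[:-1, :-1] = (np.abs(img[:-1, :-1] - img[:-1, 1:])    # right neighbour
                   + np.abs(img[:-1, :-1] - img[1:, :-1]))  # lower neighbour
    g[:, -1] = g[:, -2]  # last column copies the previous column
    g[-1, :] = g[-2, :]  # last row copies the previous row
    return g
```

A detail mask under the first preset condition is then simply `gradient_map(img) > first_reference_threshold`.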
In another implementation, when the preset condition is that the gray value of a pixel point is greater than the second reference threshold, the gray value of each pixel point in the first infrared image is compared with the second reference threshold, and the pixel points whose gray values exceed it are taken as the detail pixel points of the first infrared image.
The second reference threshold may be determined according to a distribution of gray values of pixel points in the first infrared image. For example, the median of the gray values of all the pixel points in the first infrared image may be used as the second reference threshold. Alternatively, the second reference threshold may also be a mode of gray values of all pixel points in the first infrared image. Of course, the second reference threshold may also be preset in other ways, which is not limited in this embodiment of the application.
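A small NumPy sketch of this gray-value variant of the preset condition, using the median as the second reference threshold (one of the options mentioned above; the function name is an assumption):

```python
import numpy as np

def detail_pixels_by_gray(img):
    """Boolean mask of detail pixel points: gray value greater than the
    second reference threshold, here the median of all gray values."""
    return img > np.median(img)
```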
2012: and acquiring a third infrared image with highlighted details according to the plurality of detail pixel points.
After obtaining the plurality of detail pixel points, the server may remap the gray value of the pixel point in the first infrared image according to the plurality of detail pixel points and the histogram of the first infrared image, so as to widen the gray value range of the pixel point in the first infrared image, thereby obtaining a third infrared image with highlighted details.
Illustratively, the server may weight the counts of the detail pixel points in the histogram of the first infrared image and then remap the gray values of the pixel points in the first infrared image according to the weighted histogram, so as to increase the gray difference between the detail pixel points and their adjacent non-detail pixel points, thereby obtaining the third infrared image.
For example, the server may obtain the third infrared image through the following steps A to E.
Step A: a first histogram of the first infrared image is acquired.
In one implementation, the server first obtains the number of pixel points of each gray value in the first infrared image, and then generates a first histogram of the first infrared image by using the gray values contained in the first infrared image as abscissa and the number of pixel points as ordinate.
For example, the gray values of the pixel points in the first infrared image shown in graph A of fig. 4 range from 1 to 7. The server counts the number of pixel points for each gray value, for example 4 pixel points with gray value 1 and 5 pixel points with gray value 2, and then uses the gray values 1 to 7 as the abscissa and the pixel counts as the ordinate to obtain the first histogram shown in fig. 5.
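Step A is a plain gray-value histogram; a NumPy sketch (the function name is an assumption):

```python
import numpy as np

def first_histogram(img, num_levels):
    """Step A: number of pixel points for each gray value 0..num_levels-1."""
    return np.bincount(img.ravel(), minlength=num_levels)
```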
Step B: according to each detail pixel point, increase the pixel count corresponding to the gray value of that detail pixel point in the first histogram by a specified value to obtain a second histogram.
The server may take the 1st of the plurality of detail pixel points, obtain its gray value, look up the pixel count corresponding to that gray value in the first histogram, increase the count by a specified value, and update the first histogram accordingly. It then obtains the gray value of the 2nd detail pixel point, looks up the corresponding pixel count in the updated histogram, increases it by the specified value, and updates the histogram again, and so on, until the histogram has been updated for the last detail pixel point; the histogram after the last update is taken as the second histogram.
Optionally, the server may instead obtain the number of detail pixel points corresponding to each gray value in the first histogram and update the histogram according to these counts and the specified value. Taking some gray value in the first histogram as an example (call it the first gray value for convenience), the server obtains the number of detail pixel points whose gray value equals the first gray value, multiplies this number by the specified value to obtain a first value, and adds the first value to the pixel count corresponding to the first gray value in the first histogram. Updating the pixel count for every other gray value in the same way yields the second histogram.
For example, for the gray value 1 in the first histogram of fig. 5, the number of detail pixel points with gray value 1 in graph A of fig. 4 is 2. Assuming the specified value is 3, the first value is 6; adding it to the 4 pixel points with gray value 1 in fig. 5 gives an updated count of 10. Updating the pixel count for each gray value in the first histogram of fig. 5 in this way, according to the detail pixel points shown in graph A of fig. 4, yields the second histogram shown in fig. 6.
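Step B can be implemented with the optional batch formulation above: count the detail pixel points per gray value and add that count times the specified value. A NumPy sketch (names assumed):

```python
import numpy as np

def second_histogram(hist, img, detail_mask, specified_value):
    """Step B: for each gray value, add (number of detail pixel points
    with that gray value) x (specified value) to its count."""
    detail_counts = np.bincount(img[detail_mask].ravel(), minlength=hist.size)
    return hist + detail_counts * specified_value
```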
Step C: in order of gray value from small to large, sequentially update the pixel count corresponding to each gray value in the second histogram to the sum of that count and the count corresponding to the previous gray value, so as to obtain a third histogram.
Illustratively, the server calculates the sum of the number of pixels corresponding to the 2 nd gray value and the number of pixels corresponding to the 1 st gray value according to the sequence from small to large of the gray values in the second histogram, and then updates the number of pixels corresponding to the 2 nd gray value to the calculated sum. And then, calculating the sum of the number of the pixels of the 3 rd gray value in the second histogram and the number of the pixels of the updated 2 nd gray value, updating the number of the pixels of the 3 rd gray value into the sum, and so on until the number of the pixels corresponding to the last gray value is updated by the method, and obtaining a third histogram.
For example, in fig. 6, the number of pixels having a gray value of 1 is 10, the number of pixels having a gray value of 2 is 8, the sum of the number of pixels having a gray value of 2 and the number of pixels having a gray value of 1 is calculated to be 18, and at this time, the number of pixels having a gray value of 2 is updated to be 18. In fig. 6, the number of pixels with a gray value of 3 is 5, the sum of the number of pixels with a gray value of 3 and the number of pixels with a gray value of 2 after updating is calculated to obtain 23, at this time, the number of pixels with a gray value of 3 is updated to 23, and so on, so that the third histogram shown in fig. 7 can be obtained.
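Step C is a running sum from the smallest gray value upward, i.e. a cumulative histogram; a one-line NumPy sketch:

```python
import numpy as np

def third_histogram(hist2):
    """Step C: running sum of the counts from the smallest gray value up."""
    return np.cumsum(hist2)
```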
Step D: and determining a mapping gray value corresponding to each gray value in the first histogram according to the third histogram and the reference gray number.
In one implementation, the server first determines a ratio between the number of pixels corresponding to each gray value in the third histogram and the number of pixels corresponding to the maximum gray value in the third histogram to obtain a probability of each gray value in the third histogram, and then multiplies the obtained probability of each gray value by the reference gray value to obtain a mapping gray value corresponding to each gray value.
The reference gray number is determined according to the gray level desired for the second infrared image; that is, the gray level at which the user wants to obtain the second infrared image determines the reference gray number, which is the number of gray levels indicated by that gray level. For example, when the user wants to obtain the second infrared image with a gray level of 8 bits, the reference gray number is 255; when the user wants to obtain the second infrared image with a gray level of 10 bits, the reference gray number is 1023.
For example, if the number of pixels corresponding to the maximum gray value in fig. 7 is 57, and the number of pixels having a gray value of 1 is 10, the probability of the gray value 1 is 10/57 ≈ 0.175. If the reference gray number is 255, the mapped gray value corresponding to the gray value 1 in the first histogram shown in fig. 5 is 0.175 × 255 ≈ 44.
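The normalization and scaling of step D can be sketched as follows; the helper name and the floor rounding are assumptions chosen to reproduce the 10/57 example:

```python
import numpy as np

def mapping_gray_values(third_hist, reference_gray_number):
    # Probability of each gray value: cumulative count divided by the
    # count at the maximum gray value (the last entry).
    prob = third_hist / third_hist[-1]
    # Scale by the reference gray number to get the mapped gray values.
    return np.floor(prob * reference_gray_number).astype(np.int64)
```

For a cumulative count of 10 out of 57 and a reference gray number of 255 this yields 10/57 × 255 ≈ 44, matching the example.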
Step E: and replacing the gray value of each pixel point in the first infrared image with the mapping gray value corresponding to the corresponding gray value in the first histogram to obtain a third infrared image.
For example, for any gray value in the first histogram, for example, the first gray value, the server may determine pixel points whose gray values are the first gray value in the first infrared image, and then update the gray values of the pixel points to the mapping gray values corresponding to the first gray value. For each gray value in the first histogram, the gray value of the corresponding pixel point in the first infrared image can be updated by referring to the method, so that a third infrared image is obtained.
Still taking the above example as an example, as can be seen from the above steps, the mapping gray-scale value corresponding to the gray-scale value 1 in the first histogram shown in fig. 5 is 44, at this time, the gray-scale values 1 of all the pixels with the gray-scale value 1 in the graph a of fig. 4 may be replaced by 44, and the third infrared image shown in fig. 8 can be obtained by replacing all the gray-scale values in the first infrared image of fig. 4 by the corresponding mapping gray-scale values by using this method.
Optionally, the server may also first determine a plurality of detail pixel points that satisfy the preset condition in the first infrared image, and then, when counting the number of pixel points of each gray value in the first infrared image to generate a histogram, count each detail pixel point not as 1 but as the assigned value; that is, when counting the number of pixel points corresponding to the gray value of a detail pixel point, one detail pixel point contributes the assigned value rather than 1 to that count. In this way, the second histogram of the first infrared image can be obtained directly. The server may then refer to the implementations of the foregoing steps C to E and obtain a third infrared image with highlighted details based on the second histogram.
2013: and extracting and enhancing the detail information in the third infrared image to obtain a detail enhanced image.
In an implementation manner, the server may extract the detail salient pixel points in the third infrared image by using an edge detection operator, and increase the gray value of each detail salient pixel point to obtain a detail enhanced image.
The edge detection operator may be a Roberts (Roberts) operator, a Sobel (Sobel) operator, a Laplacian (Laplacian) operator, a Canny (Canny) operator, or another operator capable of extracting detail information in the third infrared image, which is not limited in this application.
Taking the Laplacian operator as an example, the server first convolves the pixel points in the third infrared image with the Laplacian convolution kernel to obtain an amplitude for each pixel point, and takes the pixel points whose amplitude absolute values are greater than a third reference threshold as the detail salient pixel points of the third infrared image. The server then multiplies the gray value of each detail salient pixel point by a preset weight to increase it, replaces the original gray value of the corresponding detail salient pixel point with the increased gray value, and replaces the gray values of the other pixel points in the third infrared image with 0, so as to obtain the detail enhanced image. The preset weight may be determined according to the degree of visualization of the detail information in the first infrared image.
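A minimal numpy sketch of this Laplacian-based extraction follows; the 3×3 kernel, the edge padding, and the function name are assumptions (the embodiment does not fix a kernel size):

```python
import numpy as np

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]])

def detail_enhance(image, third_ref_threshold, preset_weight):
    img = image.astype(np.float64)
    pad = np.pad(img, 1, mode='edge')
    h, w = img.shape
    # 3x3 convolution with the Laplacian kernel -> amplitude per pixel
    amp = sum(LAPLACIAN[i, j] * pad[i:i + h, j:j + w]
              for i in range(3) for j in range(3))
    salient = np.abs(amp) > third_ref_threshold
    # Increase salient gray values by the preset weight; zero the rest.
    return np.where(salient, img * preset_weight, 0.0)
```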
In another implementation manner, the server may also extract the detail salient pixel points in the third infrared image by using a transform domain extraction method, and increase the grayscale value of each detail salient pixel point to obtain a detail enhanced image, where the implementation manner of extracting the detail salient pixel points by using the transform domain extraction method may refer to related technologies, and details are not repeated here in the embodiments of the present application.
After the detail enhanced image is obtained, the server can also perform noise reduction processing on the detail enhanced image, so that the signal to noise ratio of the detail enhanced image is improved.
Illustratively, the server may denoise the detail-enhanced image using an N × N filter window. The server first obtains the central pixel point HF(x, y) at the center of the filter window and the gray values of all neighborhood pixel points of HF(x, y). It then calculates the gray difference between the gray value of HF(x, y) and the gray value of each neighborhood pixel point, computes the variance of these gray differences, and also computes the average of the gray values of all neighborhood pixel points. If the variance of the gray differences is smaller than a noise discreteness judgment threshold and the average of the neighborhood gray values is larger than a neighborhood uniformity judgment threshold, the central pixel point is considered an isolated pixel point, and its gray value is replaced with the average of the gray values of its neighborhood pixel points. Otherwise, the central pixel point is considered not to be an isolated pixel point, and its gray value is kept unchanged.
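The isolated-pixel test for a single window position might look like the following; the function name and the threshold parameter names are assumptions:

```python
import numpy as np

def filter_center(window, noise_var_threshold, uniformity_threshold):
    """window: N x N gray values; returns the output gray value for the
    center pixel HF(x, y) according to the rule described above."""
    n = window.shape[0]
    center_idx = (n * n) // 2                  # flat index of the center
    flat = window.astype(np.float64).ravel()
    center = flat[center_idx]
    neighbors = np.delete(flat, center_idx)
    diffs = center - neighbors                 # gray differences
    if diffs.var() < noise_var_threshold and neighbors.mean() > uniformity_threshold:
        return neighbors.mean()                # isolated point: replace
    return center                              # otherwise keep unchanged
```

A center of 200 surrounded by uniform neighbors of 50 gives a zero variance of gray differences and a neighborhood average of 50, so it is replaced by 50 when the thresholds are met.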
Optionally, after the server performs noise reduction processing on the detail enhanced image through the N × N filtering window, the server may further filter the noise-reduced detail enhanced image again by using a bilateral filtering method, so as to obtain an edge-preserving, noise-reduced detail enhanced image. For the implementation of filtering the detail enhanced image by the bilateral filtering method, reference may be made to the related art, and details are not described here in the embodiments of the present application.
Step 202: and carrying out noise reduction processing on the first infrared image according to the ISP gain of the first infrared image to obtain a noise-reduced image.
Referring to fig. 9, in one implementation, the server may perform noise reduction processing on the first infrared image according to the ISP gain of the first infrared image through the following steps 2021 to 2023 to obtain a noise-reduced image.
2021: and acquiring the reference noise reduction strength corresponding to the target noise reduction mode.
In this embodiment, the server may first obtain the reference noise reduction strength corresponding to the target noise reduction mode according to the noise reduction mode to be adopted, that is, the target noise reduction mode.
The target noise reduction method may be a gaussian filtering method, a median filtering method, a mean filtering method, a low-pass filtering method, or other filtering methods capable of performing noise reduction processing on the first infrared image, which is not limited in this application.
It should be noted that, because each noise reduction method adopts different noise reduction principles, each noise reduction method has its corresponding reference noise reduction strength according to its own parameters.
Illustratively, when the first infrared image is denoised by gaussian filtering, the reference denoising strength of the gaussian filter is determined by the window size of the gaussian filter and the standard deviation σ.
2022: and taking the product of the ISP gain of the first infrared image and the reference noise reduction intensity as the target noise reduction intensity.
2023: and according to the target noise reduction intensity, carrying out noise reduction processing on the first infrared image in a target noise reduction mode to obtain a noise reduction image.
Still taking Gaussian filtering as an example, the target noise reduction strength for processing the first infrared image with Gaussian filtering is obtained in step 2022, and the weight coefficient of each point in the Gaussian filter window is determined according to the target noise reduction strength. Taking the pixel point in the first infrared image corresponding to the center of the Gaussian filter window as the target pixel point, the gray value of each pixel point covered by the window is weighted by multiplying it by the corresponding weight coefficient, and the weighted gray values of all covered pixel points are summed to obtain the Gaussian blur value of the target pixel point. The gray value of the target pixel point is then replaced with its Gaussian blur value, which completes the filtering of the target pixel point. In this way, the Gaussian blur values of all pixel points in the first infrared image are obtained in turn, and the noise-reduced image is obtained after replacing the gray value of each pixel point with its Gaussian blur value.
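A sketch of steps 2021 to 2023 with Gaussian filtering, taking the standard deviation σ as the reference noise reduction strength; the kernel construction and the parameter names are assumptions:

```python
import numpy as np

def gaussian_kernel(size, sigma):
    ax = np.arange(size) - size // 2
    g = np.exp(-ax ** 2 / (2.0 * sigma ** 2))
    k = np.outer(g, g)
    return k / k.sum()              # weight coefficients sum to 1

def denoise(image, isp_gain, reference_sigma, size=5):
    # Step 2022: target strength = ISP gain x reference strength.
    k = gaussian_kernel(size, isp_gain * reference_sigma)
    pad = size // 2
    img = np.pad(image.astype(np.float64), pad, mode='edge')
    out = np.empty(image.shape, dtype=np.float64)
    h, w = image.shape
    for i in range(h):              # weighted sum over the covered window
        for j in range(w):
            out[i, j] = np.sum(k * img[i:i + size, j:j + size])
    return out
```

Because the weight coefficients are normalized to sum to 1, a constant image passes through unchanged regardless of the gain.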
In the above exemplary description, the target noise reduction method is a gaussian filtering method, and for the case that the target noise reduction method is another noise reduction method, details of the embodiment of the present application are not repeated herein.
In the embodiment of the application, the server performs adaptive adjustment on the reference noise reduction strength of the target noise reduction mode through the ISP gain of the first infrared image, so that the adjusted noise reduction strength has better scene adaptability. In this case, the noise of the first infrared image is reduced by the adjusted noise reduction intensity, and more detailed information can be retained in the noise-reduced image.
Step 203: and fusing the detail enhanced image and the noise-reduced image to obtain a second infrared image.
Illustratively, the server may fuse the gray value of each pixel point in the detail enhancement image with the gray value of the pixel point at the corresponding position in the noise reduction image, so as to obtain the second infrared image.
Optionally, the server adds the gray value of each pixel point in the detail enhancement image to the gray value of the pixel point at the corresponding position in the noise reduction image, so as to obtain a second infrared image.
It should be noted that the gray value obtained by adding the gray value of a pixel point in the detail enhanced image to the gray value of the pixel point at the corresponding position in the noise-reduced image may exceed the maximum allowable gray value of the second infrared image; in this case, the resulting gray value may be set to the maximum allowable gray value. For example, when the gray level of the second infrared image is 8 bits, the maximum allowable gray value is 255. If the sum of the gray value of a certain pixel point in the detail enhanced image and the gray value of the pixel point at the corresponding position in the noise-reduced image is greater than 255, the gray value of that pixel point in the second infrared image may be set to 255.
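The fusion with saturation at the maximum allowable gray value can be sketched as follows; the clip-based helper is an assumption:

```python
import numpy as np

def fuse(detail_enhanced, noise_reduced, max_gray=255):
    # Pixel-wise sum, clipped to the maximum allowable gray value.
    total = detail_enhanced.astype(np.int64) + noise_reduced.astype(np.int64)
    return np.clip(total, 0, max_gray)
```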
In the embodiment of the application, the reference noise reduction strength of the target noise reduction mode is adjusted according to the ISP gain of the first infrared image, so that the noise reduction strength has better scene adaptability. On the basis, the noise reduction process and the detail enhancement process of the first infrared image are separately carried out, and the detail enhancement image and the noise reduction image are fused, so that compared with the method of carrying out detail enhancement after noise reduction or carrying out noise reduction after detail enhancement, the detail loss caused by the noise reduction process to the image is reduced, and the visualization effect of the infrared image is improved.
Secondly, the first infrared image can be an infrared image with a 14bit gray level without being compressed, and compared with an infrared image with a 8bit gray level after being compressed, after the 14bit infrared image is subjected to detail enhancement, the obtained detail enhancement image contains more detail information, so that the visualization effect of the detail information in the infrared image can be further improved.
Finally, in the embodiment of the application, the details of the low temperature difference area can be more prominent by performing the highlighting processing on the detail information in the first infrared image. And then, extracting and enhancing the detail information in the third infrared image obtained by the highlighting processing to obtain a detail enhanced image, so that the extraction difficulty of the detail information is reduced, and meanwhile, the visualization effect of the obtained infrared image can be improved by obtaining the final infrared image through the image enhanced by the detail information.
Next, an image processing apparatus provided in an embodiment of the present application will be described.
Referring to fig. 10, an embodiment of the present application provides an image processing apparatus 1000, which may be implemented in software or hardware as part of a server in the foregoing embodiments, where the apparatus 1000 includes: a detail enhancement module 1001, a noise reduction module 1002, and a fusion module 1003.
The detail enhancing module 1001 is configured to enhance detail information of the first infrared image to be processed to obtain a detail enhanced image.
The first infrared image is an original image which is acquired by an infrared thermal imaging device and is not subjected to compression processing, and the gray level of the first infrared image is 14bit, for example.
The detail enhancement module 1001 is mainly configured to determine a plurality of detail pixel points that satisfy a preset condition in the first infrared image, obtain a third infrared image with highlighted details according to the plurality of detail pixel points, and extract and enhance detail information in the third infrared image to obtain a detail enhanced image.
In the embodiment of the application, the detail information in the first infrared image is subjected to highlighting processing, so that the details of the low temperature difference area can be more highlighted. And then, extracting and enhancing the detail information in the third infrared image obtained by the highlighting processing to obtain a detail enhanced image, so that the extraction difficulty of the detail information is reduced, and meanwhile, the visualization effect of the obtained infrared image can be improved by obtaining the final infrared image through the image enhanced by the detail information.
Illustratively, the detail enhancement module 1001 may obtain the third infrared image by increasing a gray difference between non-detail pixel points adjacent to the plurality of detail pixel points in the first infrared image and the plurality of detail pixel points.
For example, the detail enhancement module 1001 may obtain a first histogram of a first infrared image; according to each detail pixel point, increasing the number of the pixel points corresponding to the gray value of the corresponding detail pixel point in the first histogram by a specified value to obtain a second histogram; according to the sequence of gray values from small to large, the number of pixel points corresponding to each gray value in the second histogram is updated to the sum of the number of pixel points corresponding to the corresponding gray value and the number of pixel points corresponding to the previous gray value of the corresponding gray value, and a third histogram is obtained; determining a mapping gray value corresponding to each gray value in the first histogram according to the third histogram and the reference gray number; and replacing the gray value of each pixel point in the first infrared image with the mapping gray value corresponding to the corresponding gray value in the first histogram to obtain a third infrared image.
The reference gray number may be determined according to the gray level desired for the second infrared image; that is, the gray level at which the user wants to obtain the second infrared image determines the reference gray number, which is the number of gray levels indicated by that gray level. For example, when the user wants to obtain the second infrared image with a gray level of 8 bits, the reference gray number is 255; when the user wants to obtain the second infrared image with a gray level of 10 bits, the reference gray number is 1023.
The noise reduction module 1002 is configured to perform noise reduction processing on the first infrared image according to the image signal processing ISP gain of the first infrared image, so as to obtain a noise-reduced image.
Illustratively, the denoising module 1002 is mainly configured to obtain a reference denoising strength corresponding to a target denoising manner; taking the product of the ISP gain and the reference noise reduction strength as the target noise reduction strength; and performing noise reduction processing on the first infrared image in a target noise reduction mode according to the target noise reduction strength.
The target noise reduction method may be a gaussian filtering method, a median filtering method, a mean filtering method, a low-pass filtering method, or other filtering methods capable of performing noise reduction processing on the first infrared image, which is not limited in this application. It should be noted that, because each noise reduction method adopts different noise reduction principles, each noise reduction method has its corresponding reference noise reduction strength according to its own parameters.
In the embodiment of the application, the reference noise reduction strength of the target noise reduction mode is adaptively adjusted through the ISP gain of the first infrared image, so that the adjusted noise reduction strength has better scene adaptability. In this case, the noise of the first infrared image is reduced by the adjusted noise reduction intensity, and more detailed information can be retained in the noise-reduced image.
And the fusion module 1003 is configured to fuse the detail enhancement image and the noise reduction image to obtain a second infrared image.
Exemplarily, the fusion module 1003 is mainly configured to fuse the gray value of each pixel point in the detail enhanced image with the gray value of the pixel point at the corresponding position in the noise-reduced image to obtain the second infrared image.
In summary, in the embodiment of the present application, the reference noise reduction strength of the target noise reduction mode is adjusted according to the ISP gain of the first infrared image, so that the noise reduction strength has better scene adaptability. On the basis, the noise reduction process and the detail enhancement process of the first infrared image are separately carried out, and the detail enhancement image and the noise reduction image are fused, so that compared with the method of carrying out detail enhancement after noise reduction or carrying out noise reduction after detail enhancement, the detail loss caused by the noise reduction process to the image is reduced, and the visualization effect of the infrared image is improved.
It should be noted that, when the image processing apparatus provided in the above embodiment performs image processing, only the division of the above functional modules is illustrated, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. In addition, the image processing apparatus and the image processing method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments in detail and are not described herein again.
Fig. 11 is a schematic diagram illustrating a server architecture in accordance with an example embodiment. The function of the server image processing in the above embodiment may be implemented by the server shown in fig. 11. The server may be a server in a cluster of background servers. Specifically, the method comprises the following steps:
the server 1100 includes a Central Processing Unit (CPU) 1101, a system Memory 1104 including a Random Access Memory (RAM) 1102 and a Read-Only Memory (ROM) 1103, and a system bus 1105 connecting the system Memory 1104 and the CPU 1101. The server 1100 also includes a basic Input/Output system (I/O system) 1106, which facilitates transfer of information between devices within the computer, and a mass storage device 1107 for storing an operating system 1113, application programs 1114, and other program modules 1115.
The basic input/output system 1106 includes a display 1108 for displaying information and an input device 1109 such as a mouse, keyboard, etc. for user input of information. Wherein the display 1108 and the input device 1109 are connected to the central processing unit 1101 through an input output controller 1110 connected to the system bus 1105. The basic input/output system 1106 may also include an input/output controller 1110 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, input-output controller 1110 also provides output to a display screen, a printer, or other type of output device.
The mass storage device 1107 is connected to the central processing unit 1101 through a mass storage controller (not shown) that is connected to the system bus 1105. The mass storage device 1107 and its associated computer-readable media provide non-volatile storage for the server 1100. That is, the mass storage device 1107 may include a computer-readable medium (not shown) such as a hard disk or CD-ROM (Compact disk Read-Only Memory) drive.
Without loss of generality, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash Memory or other solid state Memory device, CD-ROM, DVD (Digital Versatile disk), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that computer storage media is not limited to the foregoing. The system memory 1104 and mass storage device 1107 described above may be collectively referred to as memory.
The server 1100 may also operate in accordance with various embodiments of the application through remote computers connected to a network, such as the internet. That is, the server 1100 may connect to the network 1112 through the network interface unit 1111 that is coupled to the system bus 1105, or may connect to other types of networks or remote computer systems (not shown) using the network interface unit 1111.
The memory further includes one or more programs, and the one or more programs are stored in the memory and configured to be executed by the CPU. The one or more programs include instructions for performing the image processing method provided by the embodiments of the present application.
Embodiments of the present application also provide a computer-readable storage medium, where instructions executed by a processor of a server enable the server to execute the image processing method provided by the above embodiments. For example, the computer readable storage medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like. It is noted that the computer-readable storage medium referred to in the embodiments of the present application may be a non-volatile storage medium, in other words, a non-transitory storage medium.
It should be understood that all or part of the steps for implementing the above embodiments may be implemented by software, hardware, firmware or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The computer instructions may be stored in the computer-readable storage medium described above.
That is, in some embodiments, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the image processing method provided by the above-described embodiments.
The above description should not be taken as limiting the embodiments of the present application, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the embodiments of the present application should be included in the scope of the embodiments of the present application.

Claims (10)

1. An image processing method, characterized in that the method comprises:
enhancing the detail information of the first infrared image to be processed to obtain a detail enhanced image;
processing ISP gain according to the image signal of the first infrared image to perform noise reduction processing on the first infrared image to obtain a noise reduction image;
and fusing the detail enhanced image and the noise-reduced image to obtain a second infrared image.
2. The method according to claim 1, wherein the enhancing the detail information of the first infrared image to be processed to obtain a detail-enhanced image comprises:
determining a plurality of detail pixel points which meet preset conditions in the first infrared image;
acquiring a third infrared image with highlighted details according to the plurality of detail pixel points;
and extracting and enhancing the detail information in the third infrared image to obtain the detail enhanced image.
3. The method of claim 2, wherein the predetermined condition is that the gradient value of the pixel is greater than a first reference threshold, or the gray-level value of the pixel is greater than a second reference threshold.
4. The method according to claim 2, wherein said obtaining a third infrared image with highlighted details according to the plurality of detail pixel points comprises:
and increasing the gray difference between the non-detail pixel points adjacent to the plurality of detail pixel points in the first infrared image and the plurality of detail pixel points to obtain the third infrared image.
5. The method according to claim 4, wherein the increasing the gray scale difference between the non-detail pixel points adjacent to the detail pixel points in the first infrared image and the detail pixel points to obtain the third infrared image comprises:
acquiring a first histogram of the first infrared image;
according to each detail pixel point, increasing the number of the pixel points corresponding to the gray value of the corresponding detail pixel point in the first histogram by a specified value to obtain a second histogram;
according to the sequence of gray values from small to large, the number of pixels corresponding to each gray value in the second histogram is updated to the sum of the number of pixels corresponding to the corresponding gray value and the number of pixels corresponding to the previous gray value of the corresponding gray value, and a third histogram is obtained;
determining a mapping gray value corresponding to each gray value in the first histogram according to the third histogram and the reference gray number;
and replacing the gray value of each pixel point in the first infrared image with the mapping gray value corresponding to the corresponding gray value in the first histogram to obtain the third infrared image.
6. The method according to claim 1, wherein the performing noise reduction processing on the first infrared image according to the image signal processing ISP gain of the first infrared image comprises:
acquiring reference noise reduction strength corresponding to a target noise reduction mode;
taking the product of the ISP gain and the reference noise reduction strength as a target noise reduction strength;
and performing noise reduction processing on the first infrared image by adopting the target noise reduction mode according to the target noise reduction strength.
7. The method according to any one of claims 1 to 6, wherein the fusing the detail-enhanced image with the noise-reduced image to obtain a second infrared image comprises:
and fusing the gray value of each pixel point in the detail enhancement image with the gray value of the pixel point at the corresponding position in the noise reduction image to obtain the second infrared image.
8. An image processing apparatus, characterized in that the apparatus comprises:
a detail enhancement module, configured to enhance detail information of a first infrared image to be processed to obtain a detail enhanced image;
a noise reduction module, configured to perform noise reduction processing on the first infrared image according to an image signal processing (ISP) gain of the first infrared image to obtain a noise reduction image;
and a fusion module, configured to fuse the detail enhancement image and the noise reduction image to obtain a second infrared image.
9. The apparatus of claim 8,
the detail enhancement module is specifically configured to: determine a plurality of detail pixel points meeting a preset condition in the first infrared image; acquire a third infrared image with highlighted details according to the plurality of detail pixel points; and extract and enhance detail information in the third infrared image to obtain the detail enhanced image;
wherein the preset condition is that the gradient value of a pixel point is greater than a first reference threshold, or the gray value of a pixel point is greater than a second reference threshold;
wherein the detail enhancement module is specifically configured to: increase the gray difference between the detail pixel points and the non-detail pixel points adjacent to the detail pixel points in the first infrared image to obtain the third infrared image;
wherein the detail enhancement module is specifically configured to: acquire a first histogram of the first infrared image; for each detail pixel point, increase by a specified value the number of pixel points corresponding to the gray value of that detail pixel point in the first histogram to obtain a second histogram; in order of gray values from small to large, update the number of pixel points corresponding to each gray value in the second histogram to the sum of the number of pixel points corresponding to that gray value and the number of pixel points corresponding to the immediately preceding gray value, to obtain a third histogram; determine a mapping gray value corresponding to each gray value in the first histogram according to the third histogram and the reference gray number; and replace the gray value of each pixel point in the first infrared image with the mapping gray value corresponding to its gray value in the first histogram to obtain the third infrared image;
the noise reduction module is specifically configured to: acquire a reference noise reduction strength corresponding to a target noise reduction mode; take the product of the ISP gain and the reference noise reduction strength as a target noise reduction strength; and perform noise reduction processing on the first infrared image in the target noise reduction mode according to the target noise reduction strength;
the fusion module is specifically configured to: fuse the gray value of each pixel point in the detail enhancement image with the gray value of the pixel point at the corresponding position in the noise reduction image to obtain the second infrared image.
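The preset condition in claim 9 (gradient value above a first reference threshold, or gray value above a second reference threshold) can be sketched directly in NumPy; the threshold values and the choice of gradient operator below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def detail_mask(img, first_threshold=20.0, second_threshold=200):
    # A pixel point is a detail pixel point when its gradient value
    # exceeds the first reference threshold OR its gray value exceeds
    # the second reference threshold (both thresholds assumed here).
    f = img.astype(np.float64)
    gy, gx = np.gradient(f)   # central-difference gradients per axis
    grad = np.hypot(gx, gy)   # gradient magnitude
    return (grad > first_threshold) | (f > second_threshold)
```

The resulting boolean mask is the `detail_mask` input assumed by the histogram sketch for claim 5.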
10. A computer-readable storage medium, in which a computer program is stored, which, when executed by a computer, implements the image processing method according to any one of claims 1 to 7.
CN202111204143.0A 2021-10-15 2021-10-15 Image processing method, device and storage medium Pending CN113888438A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111204143.0A CN113888438A (en) 2021-10-15 2021-10-15 Image processing method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111204143.0A CN113888438A (en) 2021-10-15 2021-10-15 Image processing method, device and storage medium

Publications (1)

Publication Number Publication Date
CN113888438A true CN113888438A (en) 2022-01-04

Family

ID=79003023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111204143.0A Pending CN113888438A (en) 2021-10-15 2021-10-15 Image processing method, device and storage medium

Country Status (1)

Country Link
CN (1) CN113888438A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116051425A (en) * 2023-03-21 2023-05-02 杭州微影软件有限公司 Infrared image processing method and device, electronic equipment and storage medium
CN116205910A (en) * 2023-04-27 2023-06-02 四川省港奇电子有限公司 Injection molding temperature self-adaptive learning regulation and control system for power adapter

Similar Documents

Publication Publication Date Title
US9852353B2 (en) Structure aware image denoising and noise variance estimation
AU2017232186A1 (en) Fast and robust image alignment for burst mode
CN107481271B (en) Stereo matching method, system and mobile terminal
CN113888438A (en) Image processing method, device and storage medium
CN111275040B (en) Positioning method and device, electronic equipment and computer readable storage medium
CN112241976A (en) Method and device for training model
CN109214996B (en) Image processing method and device
CN109035167B (en) Method, device, equipment and medium for processing multiple faces in image
WO2014070489A1 (en) Recursive conditional means image denoising
CN111368717A (en) Sight line determining method and device, electronic equipment and computer readable storage medium
CN113781406B (en) Scratch detection method and device for electronic component and computer equipment
CN111080665B (en) Image frame recognition method, device, equipment and computer storage medium
CN108229583B (en) Method and device for fast template matching based on main direction difference characteristics
CN111445487A (en) Image segmentation method and device, computer equipment and storage medium
CN109360167B (en) Infrared image correction method and device and storage medium
CN112435278B (en) Visual SLAM method and device based on dynamic target detection
CN115619652A (en) Image blind denoising method and device, electronic equipment and storage medium
WO2024016632A1 (en) Bright spot location method, bright spot location apparatus, electronic device and storage medium
CN113438386B (en) Dynamic and static judgment method and device applied to video processing
CN113284075B (en) Image denoising method and device, electronic device and storage medium
CN115564682A (en) Uneven-illumination image enhancement method and system
WO2021189460A1 (en) Image processing method and apparatus, and movable platform
CN114066794A (en) Image processing method, device and equipment and storage medium
JP3897306B2 (en) Method for supporting extraction of change region between geographic images and program capable of supporting extraction of change region between geographic images
CN114025089A (en) Video image acquisition jitter processing method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination