CN112700382B - Image seam elimination method and device and electronic equipment


Info

Publication number
CN112700382B
CN112700382B (application CN202011540592.8A)
Authority
CN
China
Prior art keywords: difference value, image, seam, value, difference
Prior art date
Legal status: Active (assumption; not a legal conclusion)
Application number
CN202011540592.8A
Other languages
Chinese (zh)
Other versions
CN112700382A (en
Inventor
余瑾
刘俊
焦玉茜
Current Assignee
Hangzhou Hikmicro Sensing Technology Co Ltd
Original Assignee
Hangzhou Hikmicro Sensing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikmicro Sensing Technology Co Ltd
Priority to CN202011540592.8A
Publication of CN112700382A
Application granted
Publication of CN112700382B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 - Geometric image transformations in the plane of the image
    • G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4038 - Image mosaicing, e.g. composing plane images from plane sub-images
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/70 - Denoising; Smoothing
    • G06T 5/73 - Deblurring; Sharpening
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/11 - Region-based segmentation
    • G06T 7/136 - Segmentation; Edge detection involving thresholding
    • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10048 - Infrared image
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20212 - Image combination
    • G06T 2207/20224 - Image subtraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application provides an image seam elimination method, an image seam elimination device and an electronic device. The method comprises the following steps: performing non-uniformity correction on an image acquired by an infrared focal plane detector to obtain a corrected image, wherein the focal plane of the infrared focal plane detector comprises at least two parts spliced into the focal plane, each part being driven by an independent circuit; for any seam in the corrected image, extracting the difference values of the pixels on the two sides of the seam, the seam corresponding to the splice position of two adjacent parts of the at least two parts of the focal plane; and eliminating the seam according to the difference values. The method enables seam elimination for infrared images acquired by a spliced infrared detector.

Description

Image seam elimination method and device and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a method and an apparatus for eliminating an image seam, and an electronic device.
Background
An uncooled infrared focal plane array is an array of infrared-sensitive pixels. An infrared-sensitive pixel absorbs external infrared radiation, which raises the temperature of the pixel; the temperature rise changes the resistance of a heat-sensitive material. The array can operate without cryogenic cooling, i.e., at ambient (non-absolute-zero) temperatures.
Owing to circuit design, manufacturing process, scene radiation and other factors, it is difficult to keep the response rate of every pixel on the focal plane of an uncooled infrared focal plane detector consistent, which causes image non-uniformity; non-uniformity correction is therefore required to improve overall imaging quality and definition.
Currently, when a large-area-array uncooled infrared focal plane detector (such as a megapixel-class 1024 × 1280 array) is designed, the focal plane array is divided equally into an upper part and a lower part (such a detector may be called a spliced infrared detector), each driven by an independent circuit. The difference between the two circuits makes the responses of the upper and lower halves of the detector uneven, so a seam appears in the infrared image.
In addition to circuit design factors, the thermal stability of the whole device is closely related to seam generation: when the substrate temperature drifts, the non-uniformity between the upper and lower parts may increase or decrease, and the seam in the infrared image gradually becomes obvious.
However, most current research on spliced infrared detectors focuses on splicing processes and circuit design; little attention has been paid to their imaging defects, and no seam elimination method exists for the infrared images of spliced infrared detectors.
Disclosure of Invention
In view of the foregoing, the present application provides an image seam eliminating method, an image seam eliminating device and an electronic device.
Specifically, the application is realized by the following technical scheme:
according to a first aspect of embodiments of the present application, there is provided an image seam elimination method, including:
carrying out non-uniformity correction on an image acquired by an infrared focal plane detector to obtain a corrected image; the focal plane of the infrared focal plane detector comprises at least two parts, the at least two parts are spliced into the focal plane, and each part is driven by an independent circuit;
extracting, for any seam in the corrected image, the difference values of the pixels on the two sides of the seam; the seam corresponds to a splice position of two adjacent parts of the at least two parts of the focal plane;
and eliminating the joint according to the difference value.
According to a second aspect of embodiments of the present application, there is provided an image seam elimination method, including:
carrying out non-uniformity correction on the image to be processed to obtain a corrected image to be processed;
extracting difference values of pixels at two sides of any joint in the corrected image to be processed;
and eliminating the seam according to the difference values.
According to a third aspect of embodiments of the present application, there is provided an image seam elimination apparatus including:
the correcting unit is used for carrying out non-uniformity correction on the image acquired by the infrared focal plane detector to obtain a corrected image; the focal plane of the infrared focal plane detector comprises at least two parts, the at least two parts are spliced into the focal plane, and each part is driven by an independent circuit;
an extracting unit, configured to extract, for any seam in the corrected image, a difference value of pixels at two sides of the seam in the corrected image; the seam corresponds to a splice location of two adjacent portions of the at least two portions of the focal plane;
and the processing unit is used for eliminating the joint according to the difference value.
According to a fourth aspect of embodiments of the present application, there is provided an image seam elimination apparatus including:
the correction unit is used for carrying out non-uniformity correction on the image to be processed to obtain a corrected image to be processed;
an extracting unit, configured to extract, for any seam in the corrected image to be processed, a difference value of pixels at two sides of the seam in the corrected image to be processed;
and the processing unit is used for eliminating the seam according to the difference values.
According to a fifth aspect of embodiments of the present application, there is provided an electronic device comprising a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor for executing the machine-executable instructions to implement the image seam elimination method of the first or second aspect.
According to a sixth aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored therein a computer program which, when executed by a processor, implements the image seam elimination method of the first or second aspect.
According to a seventh aspect of embodiments of the present application, there is provided a computer program stored on a machine-readable storage medium which, when executed by a processor, causes the processor to perform the image seam elimination method of the first or second aspect.
According to the image seam elimination method, for an infrared focal plane detector whose focal plane comprises at least two parts, each driven by an independent circuit, a corrected image is obtained by performing non-uniformity correction on the image acquired by the detector; for any seam in the corrected image, the difference values of the pixels on the two sides of the seam are extracted, and the seam is eliminated according to the difference values. Seams in the image caused by the splicing of the focal plane are thereby eliminated, and the imaging effect of the infrared focal plane detector is optimized.
Drawings
FIG. 1 is a flow chart of an image seam elimination method according to an exemplary embodiment of the present application;
FIG. 2 is a flow chart illustrating the elimination of the seam according to the difference values, according to an exemplary embodiment of the present application;
FIG. 3 is a flow chart illustrating a method for determining the valid and invalid difference values among the difference values, according to an exemplary embodiment of the present application;
FIG. 4 is a flow chart illustrating the replacement of invalid difference values according to valid difference values according to an exemplary embodiment of the present application;
FIG. 5 is a flow chart illustrating the elimination of the seam according to the replaced difference values, according to an exemplary embodiment of the present application;
FIG. 6 is a flow chart of an image seam elimination method according to an exemplary embodiment of the present application;
FIG. 7A is a schematic illustration of an infrared image with seams shown in an exemplary embodiment of the present application;
FIG. 7B is a schematic diagram showing an original image and a target background segmentation effect corresponding to the original image according to an exemplary embodiment of the present application;
FIG. 7C is a schematic illustration of moire artifacts in an infrared image after seam elimination according to an exemplary embodiment of the present application;
FIG. 7D is a schematic diagram illustrating an infrared image seam elimination effect according to an exemplary embodiment of the present application;
FIGS. 7E and 7F are schematic diagrams of images in which seams are present, according to exemplary embodiments of the present application;
FIG. 8 is a schematic diagram illustrating a relationship between substrate temperature and probe response in accordance with an exemplary embodiment of the present application;
FIG. 9 is a flowchart of a target background binary image obtained by performing target background segmentation on an image according to an exemplary embodiment of the present disclosure;
fig. 10 is a schematic structural view of an image seam elimination device according to an exemplary embodiment of the present application;
fig. 11 is a schematic structural view of an image seam elimination device according to an exemplary embodiment of the present application;
fig. 12 is a schematic diagram of a hardware structure of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the appended claims.
The terminology used in the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In order to better understand the technical solutions provided by the embodiments of the present application and make the above objects, features and advantages of the embodiments of the present application more obvious, the technical solutions in the embodiments of the present application are described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, a flowchart of an image seam eliminating method according to an embodiment of the present application is shown in fig. 1, where the image seam eliminating method may include the following steps:
it should be noted that, the sequence number of each step in the embodiment of the present application does not mean that the execution sequence of each process should be determined by the function and the internal logic of each process, and should not limit the implementation process of the embodiment of the present application in any way.
Step S100, performing non-uniformity correction on an image acquired by an infrared focal plane detector to obtain a corrected image, wherein the focal plane of the infrared focal plane detector comprises at least two parts, the at least two parts are spliced into the focal plane, and each part is driven by an independent circuit.
In this embodiment of the present application, the focal plane comprises at least two parts, each driven by an independent circuit. As mentioned above, the image acquired by such a large-area-array uncooled infrared focal plane detector may contain a seam at the position corresponding to the splice of the focal plane, which affects the imaging effect of the image.
In order to optimize the imaging effect of the image acquired by this type of infrared focal plane detector, it is necessary to perform a seam elimination process on the image acquired by this type of infrared focal plane detector.
Considering that circuit design, manufacturing process, scene radiation and other factors make it difficult to keep the response rate of every pixel on the focal plane consistent, which causes image non-uniformity, non-uniformity correction needs to be performed on the image acquired by the infrared focal plane detector to improve overall imaging quality and definition.
For example, non-uniformity correction of the image acquired by the infrared focal plane detector can be implemented by sequentially performing two-point correction, dead pixel removal, time-domain filtering and the like on the acquired image.
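As an illustration of the two-point correction mentioned above, the following NumPy sketch is one common formulation, not the patent's implementation: the calibration frames, the choice of the frame mean as the target response, and all names are assumptions for the example.

```python
import numpy as np

def two_point_correction(raw, low_frame, high_frame):
    """Per-pixel gain/offset (two-point) non-uniformity correction.

    low_frame and high_frame are calibration frames captured against
    uniform blackbody sources at a low and a high temperature; the
    target response at each temperature is taken to be the frame mean
    (an illustrative choice).
    """
    target_low = low_frame.mean()
    target_high = high_frame.mean()
    span = high_frame - low_frame
    span[span == 0] = 1e-6                 # guard against dead pixels
    gain = (target_high - target_low) / span
    offset = target_low - gain * low_frame
    return gain * raw + offset
```

After such a correction a uniform scene maps to a nearly uniform image within each half; the residual offset between independently driven halves is the seam addressed by the following steps.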
Step S110, extracting, for any seam in the corrected image, the difference values of the pixels on the two sides of the seam; a seam in the image corresponds to a splice position of two adjacent parts of the at least two parts of the focal plane.
In this embodiment of the present application, for any seam in the corrected image, the difference value of the pixels at two sides of the seam in the corrected image may be extracted.
For example, assume that a seam exists between row i1 and row i2 of the corrected image (|i1 − i2| = 1). The difference values of the seam are the differences between the pixel values of the pixels in row i1 and those of the pixels in the same columns in row i2: the difference between the pixel value of the column-1 pixel in row i1 and that of the column-1 pixel in row i2, and so on, through the difference between the pixel value of the column-m pixel in row i1 and that of the column-m pixel in row i2, where m is the number of columns of the image.
Step S120, eliminating the joint according to the extracted difference value.
In this embodiment of the present application, for any seam in the corrected image, the seam may be eliminated according to the extracted difference values of the pixels at two sides of the seam.
For example, for the side of the seam whose pixel values are smaller, the difference value corresponding to each column may be added to the pixel values of that column on that side; alternatively, for the side whose pixel values are larger, the corresponding difference value may be subtracted from the pixel values of each column on that side. Specific implementations are described below with reference to examples.
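Steps S110 and S120 for a single horizontal seam can be sketched as follows. This is an illustrative NumPy sketch, not the patent's implementation; here the lower part is compensated by adding the per-column difference values, and the row index `i1` is an assumed parameter.

```python
import numpy as np

def eliminate_seam(img, i1):
    """Remove the seam between rows i1 and i1+1 by compensating the
    second (lower) part with the per-column difference values."""
    img = img.astype(np.float64).copy()
    dif = img[i1, :] - img[i1 + 1, :]      # one difference value per column
    img[i1 + 1:, :] += dif                 # shift the whole lower part
    return img
```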
In the embodiments of the present application, for an infrared focal plane detector whose focal plane comprises three or more parts, multiple seams may exist in the acquired image (the seams divide the image into multiple parts). When each seam is eliminated, it is necessary to ensure that a uniform reference is used across the parts of the image.
For example, taking the image shown in FIG. 7E, assume that two seams exist in the image (the focal plane comprises three parts). When seam elimination is performed, the difference values of the pixels on the two sides of each seam are determined in the manner described above; when the seams are eliminated according to the difference values, the middle part of the image can be used as the reference to compensate the upper and lower parts, i.e., the pixel values of the middle part remain unchanged while those of the upper and lower parts are adjusted, which keeps the whole image consistent and optimizes the imaging effect.
As a further example, consider the image shown in FIG. 7F, in which four seams form a "cross" in the image (the focal plane comprises four parts).
When seam elimination is performed, the difference values of the pixels on the two sides of seam 1 and of seam 2 can be determined in the manner described above, and, according to the determined difference values, the part-2 image (upper right) and the part-3 image (lower left) are compensated with the part-1 image (upper left) as the reference. Then, the difference values of the pixels on the two sides of seam 3 are determined and the part-4 image (lower right) is compensated with the part-2 image as the reference; or the difference values on the two sides of seam 4 are determined and the part-4 image is compensated with the part-3 image as the reference.
Alternatively, when seam elimination is performed, the difference values of the pixels on the two sides of seam 1 and of seam 4 may be determined in the manner described above, and, according to the determined difference values, the part-2 image is compensated with the part-1 image as the reference and the part-4 image with the part-3 image as the reference. At this point the part-1 and part-2 images can be treated as one part (e.g., the upper part) and the part-3 and part-4 images as another (e.g., the lower part); the difference values of the pixels on the two sides of seam 2 and seam 3 are then determined, and, according to the determined difference values, the lower part is compensated with the upper part as the reference, or the upper part with the lower part as the reference.
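The three-part case of FIG. 7E, with the middle part kept as the reference, can be sketched as follows; the seam row indices `s1` and `s2` and the function name are assumptions for the example, not the patent's notation.

```python
import numpy as np

def eliminate_two_seams(img, s1, s2):
    """Three-part image with seams after rows s1 and s2: keep the
    middle part unchanged and compensate the upper and lower parts."""
    img = img.astype(np.float64).copy()
    dif_top = img[s1 + 1, :] - img[s1, :]   # middle minus upper, per column
    img[: s1 + 1, :] += dif_top
    dif_bot = img[s2, :] - img[s2 + 1, :]   # middle minus lower, per column
    img[s2 + 1:, :] += dif_bot
    return img
```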
It can be seen that in the flow shown in FIG. 1, for an image acquired by an infrared focal plane detector whose focal plane comprises multiple parts, after non-uniformity correction is performed, the seams are eliminated according to the difference values of the pixel values on the two sides of each seam in the corrected image, which optimizes the imaging effect.
In some embodiments, as shown in fig. 2, in step S120, the seam is eliminated according to the difference value, which may be achieved by:
step S121, performing object and background segmentation on the corrected image, and determining an effective difference value and an ineffective difference value in the difference values according to the segmentation result.
Step S122, replacing the invalid difference value according to the valid difference value to obtain a replaced difference value.
And step S123, eliminating the joint according to the replaced difference value.
By way of example, the above image seam elimination scheme compensates the image according to the differences between the pixel values of the pixels on the two sides of the seam. When a small over-bright target sits at the center of the scene, or two objects with a large brightness difference happen to straddle the seam, compensating directly according to these difference values affects other background portions and produces bright or dark wide vertical lines. Therefore, the image is segmented into target and background, and seam elimination is performed according to the pixel differences at the seam within the background portion, which optimizes the seam elimination effect.
Accordingly, target and background segmentation may be performed on the corrected image, and the valid and invalid difference values among the difference values determined in step S110 may be identified according to the segmentation result.
For example, the maximum inter-class variance method (OTSU) may be used to achieve target background segmentation of the image, and the specific implementation thereof may be described below with reference to specific examples, which are not described herein.
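A textbook formulation of OTSU thresholding is sketched below. This is a generic sketch, not the patent's specific segmentation flow (which is described later with reference to FIG. 9); the assumption that the hot target is brighter than the background, and all names, are illustrative.

```python
import numpy as np

def otsu_background_mask(img, bins=256):
    """Target/background segmentation by maximizing the between-class
    variance (OTSU). Returns a binary image: 1 = background, 0 = target,
    assuming the target is brighter than the background."""
    hist, edges = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class-0 probability per threshold
    mu = np.cumsum(p * np.arange(bins))     # class-0 cumulative mean (bin index)
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    k = np.nanargmax(sigma_b)               # threshold bin with max variance
    thresh = edges[k + 1]
    return (img < thresh).astype(np.uint8)  # 1 where darker (background)
```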
In one example, for any one of the difference values of the pixels at both sides of the seam extracted in step S110, when two pixel points at both sides of the seam corresponding to the difference value are both backgrounds, the difference value is a valid difference value; otherwise, the difference value is an invalid difference value.
In one example, as shown in fig. 3, in step S121, the corrected image is subjected to object and background segmentation, and the valid difference value and the invalid difference value in the difference values are determined according to the segmentation result, which may be achieved by:
step S1211, performing object and background segmentation on the corrected image, so as to obtain an object background binary image.
Step S1212, for any difference value, when the values of two pixel points corresponding to the difference value in the target background binary image are both 1, determining the difference value as an effective difference value; otherwise, the difference value is determined to be an invalid difference value.
By way of example, the target background binary image can be obtained by performing target and background segmentation on the corrected image.
In the target background binary image, a pixel at the background (hereinafter referred to as a first type pixel) corresponds to a value of 1, and a pixel at the target (hereinafter referred to as a second type pixel) corresponds to a value of 0.
For any difference value, when the value of two pixel points (two adjacent pixel points in the same column on two sides of a joint) corresponding to the difference value in the target background binary image is 1, that is, the two pixel points are all backgrounds, determining the difference value as an effective difference value; otherwise, the difference value is determined to be an invalid difference value.
For example, when the value of any one of the two pixels in the target background binary image is 0 (the pixel is the target), or when the values of the two pixels in the target background binary image are both 0 (the two pixels are both the target), the difference value is determined to be an invalid difference value.
When the valid difference values and invalid difference values have been determined, the invalid difference values can be replaced according to the valid ones to obtain replaced difference values, and the seam is eliminated according to the replaced difference values; this avoids wide vertical lines in the image after seam elimination and further optimizes the imaging effect.
In one example, as shown in fig. 4, in step S122, the replacement of the invalid difference value according to the valid difference value may be achieved by:
Step S1221, determining an average value of the effective difference values.
Step S1222, replacing the invalid difference value with the average value of the valid difference values.
For example, when the effective difference value and the ineffective difference value among the difference values are determined in the above-described manner, the average value of each effective difference value may be determined and the ineffective difference value may be replaced with the average value of the effective difference values.
For example, assuming that the difference values extracted in step S110 include dif1, dif2, …, and dif1280, where dif1, dif2, … dif1200 are valid difference values and dif1201, dif1202, …, and dif1280 are invalid difference values, the average value of the valid difference values may be determined to be:
dif_ave=(dif1+dif2+…+dif1200)/1200
further, any one of the invalid difference values such as dif1201, dif1202, …, dif1280 may be replaced with dif_ave.
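Steps S1221 and S1222 can be sketched as follows (illustrative names; not the patent's implementation):

```python
import numpy as np

def replace_invalid(dif, valid):
    """Replace every invalid difference value with the mean of the
    valid ones (dif_ave in the example above)."""
    dif = dif.astype(np.float64).copy()
    dif[~valid] = dif[valid].mean()
    return dif
```

For the dif1…dif1280 example above, the entries dif1201…dif1280 would all become dif_ave.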
In one example, as shown in fig. 5, in step S123, the seam is eliminated according to the replaced difference value, which may be achieved by:
step S1231, performing smoothing filtering on the replaced difference value to obtain a filtered difference value.
Step S1232, the seam is eliminated according to the filtered difference value.
For example, if the valid difference values were simply used to replace the invalid ones, the discontinuity of the compensation values could leave fine vertical lines in the image after seam elimination and affect imaging quality. Therefore, after the difference values are replaced in the above manner, the replaced difference values may be smoothed by filtering to improve the continuity of the compensation values and further optimize the imaging effect of the image after seam elimination.
Accordingly, when the replaced difference value is obtained in the manner described in step S122, the replaced difference value may be smoothed to obtain a filtered difference value, and the seam is eliminated according to the filtered difference value, so as to avoid occurrence of fine vertical lines in the image after seam elimination, and further optimize the image imaging effect.
For example, the moving smoothing filtering may be performed on the replaced difference value according to a preset filtering window, and the specific implementation of the moving smoothing filtering may be described below in conjunction with a specific example, which is not described herein.
When moving smoothing filtering is performed on the difference values, a suitable window size needs to be selected: if the window is too small, the compensation values remain discontinuous and a serious fine-vertical-line phenomenon can occur; if the window is too large, local layering appears in the image after seam elimination, which affects the seam elimination effect.
In one example, the window size may be 1 x 55 for 1024 x 1280 images.
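The moving smoothing filtering can be sketched as a moving average; the reflected edge padding is an illustrative assumption (the patent does not specify edge handling), and the default window mirrors the 1 × 55 size suggested for 1024 × 1280 images.

```python
import numpy as np

def smooth_difference(dif, window=55):
    """Moving-average smoothing of the replaced difference vector.
    Edges are reflect-padded so the output length matches the input."""
    pad = window // 2
    padded = np.pad(dif, pad, mode="reflect")
    kernel = np.ones(window) / window
    return np.convolve(padded, kernel, mode="valid")
```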
In some embodiments, the difference values form an n-element one-dimensional vector, where n is the number of columns of the image acquired in step S100. The j-th of the difference values extracted in step S110 is I(i1, j) minus I(i2, j), where I(i1, j) is the pixel value of the pixel in row i1, column j of the corrected image and I(i2, j) is the pixel value of the pixel in row i2, column j; row i1 belongs to the first part of the corrected image, row i2 belongs to the second part, the seam of the image is the seam between the first part and the second part, and |i1 − i2| = 1.
In step S120, the seam elimination according to the difference value may include:
for any pixel point in the j-th column in the second portion of the corrected image, adding the j-th value in the final difference values to the pixel value of the pixel point; or,
for any pixel point in the j-th column in the first portion of the corrected image, subtracting the j-th value in the final difference values from the pixel value of the pixel point.
For example, when the difference values are obtained by subtracting the pixel values of the pixel points in the second portion from those of the pixel points in the first portion on the two sides of the seam, once the difference values are determined in the above manner, the pixel values of the pixel points in the first portion or in the second portion of the image may be compensated according to the difference values.
For example, the difference value of the corresponding column may be added to the pixel value of any pixel point in that column of the second portion, or subtracted from the pixel value of any pixel point in that column of the first portion.
Alternatively, when the difference values are determined, the pixel values of the pixel points in both the first portion and the second portion of the image may be compensated.
For example, assuming that the j-th value in the difference values (corresponding to the j-th column of pixels of the image) is 10, x may be subtracted from the pixel value of any pixel point in the j-th column of the first portion, and (10 − x) added to any pixel point in the j-th column of the second portion, where 0 < x < 10.
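The split in the example above can be sketched as follows (a minimal illustration only; the function name and signature are assumptions, not part of the embodiment):

```python
def split_compensation(diff_j: float, x: float):
    """Split the j-th column difference between the two image halves:
    lower the first (upper) portion by x and raise the second (lower)
    portion by (diff_j - x), so the total correction across the seam
    still equals diff_j. Requires 0 < x < diff_j."""
    if not 0 < x < diff_j:
        raise ValueError("x must satisfy 0 < x < diff_j")
    return -x, diff_j - x

# A column difference of 10 split with x = 4: the upper half is
# lowered by 4 and the lower half raised by 6.
upper_delta, lower_delta = split_compensation(10, 4)
```

However the split is chosen, the total seam correction lower_delta − upper_delta always equals the original column difference.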
Referring to fig. 6, a flowchart of an image seam elimination method according to an embodiment of the present application is shown. As shown in fig. 6, the method may include the following steps:
Step S600: performing non-uniformity correction on an image to be processed to obtain a corrected image to be processed.
The image to be processed is not limited to any fixed image; it may be any image on which seam elimination needs to be performed, for example, an image acquired by the infrared focal plane detector described in the above method embodiments.
Illustratively, the image to be processed is an infrared image to be processed.
In this embodiment of the present application, the implementation of step S600 may be referred to the related description in the above embodiment, which is not repeated herein.
Step S610, for any seam in the corrected image to be processed, extracting the difference values of the pixels at two sides of the seam in the corrected image to be processed.
Step S620, eliminating the joint according to the difference value.
In this embodiment, the implementation of step S610 to step S620 may be referred to the related description in the above embodiment, which is not repeated herein.
In some embodiments, in step S620, the seam is eliminated according to the difference value, which may include:
performing target and background segmentation on the corrected image to be processed, and determining an effective difference value and an ineffective difference value in the difference values according to segmentation results;
replacing the invalid difference value according to the valid difference value to obtain a replaced difference value;
and eliminating the joint according to the replaced difference value.
By way of example, the image seam elimination scheme compensates the image according to the differences between the pixel values of the pixel points on the two sides of the seam. When there is a small over-bright target in the center of the scene, or when two different objects with a large brightness difference happen to lie on the seam, compensating according to the difference values of the pixels on the two sides of the seam would affect other background portions and produce bright or dark wide vertical lines. Therefore, the image is first segmented into target and background, and seam elimination is performed according to the pixel differences at the seam within the background portion, so that the seam-elimination effect can be optimized.
Accordingly, it is possible to perform object and background segmentation on the corrected image to be processed, and determine the valid and invalid difference values among the difference values determined in step S610 according to the segmentation result.
For example, the target and background segmentation of the corrected image to be processed may be implemented by the maximum between-class variance method (OTSU); a specific implementation is described below with reference to a specific example and is not repeated here.
In one example, for any one of the difference values of the pixels at both sides of the seam extracted in step S610, when two pixel points at both sides of the seam corresponding to the difference value are both backgrounds, the difference value is a valid difference value; otherwise, the difference value is an invalid difference value.
In one example, the target-to-background segmentation of the image to be processed and determining the valid difference value and the invalid difference value in the difference values according to the segmentation result may include:
performing target and background segmentation on the corrected image to obtain a target background binary image;
for any difference value, when the values of two pixel points corresponding to the difference value in the target background binary image are 1, determining the difference value as an effective difference value; otherwise, the difference value is determined to be an invalid difference value.
By way of example, the target background binary image can be obtained by performing target and background segmentation on the corrected image to be processed.
In the target background binary image, a pixel at the background (hereinafter referred to as a first type pixel) corresponds to a value of 1, and a pixel at the target (hereinafter referred to as a second type pixel) corresponds to a value of 0.
For any difference value, when the value of two pixel points (two adjacent pixel points in the same column on two sides of a joint) corresponding to the difference value in the target background binary image is 1, that is, the two pixel points are all backgrounds, determining the difference value as an effective difference value; otherwise, the difference value is determined to be an invalid difference value.
For example, when the value of any one of the two pixels in the target background binary image is 0 (the pixel is the target), or when the values of the two pixels in the target background binary image are both 0 (the two pixels are both the target), the difference value is determined to be an invalid difference value.
After the valid difference values and invalid difference values are determined, the invalid difference values may be replaced according to the valid ones to obtain replaced difference values, and the seam may be eliminated according to the replaced difference values, so that wide vertical lines in the image after seam elimination are avoided and the imaging effect is further optimized.
In one example, the replacing the invalid difference value according to the valid difference value may include:
determining an average value of the effective difference values;
the invalid difference value is replaced by the average of the valid difference values.
For example, when the effective difference value and the ineffective difference value among the difference values are determined in the above-described manner, the average value of each effective difference value may be determined and the ineffective difference value may be replaced with the average value of the effective difference values.
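The replacement described above can be sketched as follows (a hedged NumPy sketch; the function and argument names are assumptions):

```python
import numpy as np

def replace_invalid(diff, valid_mask):
    """Replace each invalid difference value with the mean of the
    valid difference values, leaving valid values unchanged."""
    diff = np.asarray(diff, dtype=np.float64).copy()
    valid_mask = np.asarray(valid_mask, dtype=bool)
    diff[~valid_mask] = diff[valid_mask].mean()
    return diff

# A bright target on the seam makes column 2 invalid; it is replaced
# by the mean of the valid columns, (1 + 3 + 2) / 3 = 2.
replaced = replace_invalid([1.0, 3.0, 250.0, 2.0],
                           [True, True, False, True])
```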
In one example, the removing the seam according to the replaced difference value may include:
smoothing and filtering the replaced difference value to obtain a filtered difference value;
and eliminating the joint according to the filtered difference value.
For example, if the valid difference values were simply used to replace the invalid ones, the discontinuity of the resulting difference compensation values could leave fine vertical lines in the image after seam elimination and degrade the imaging quality. Therefore, after the difference values are replaced in the above manner, the replaced difference values may be smoothed by filtering to improve the continuity of the difference compensation values and further optimize the imaging effect of the image after seam elimination.
Accordingly, when the replaced difference values are obtained, they may be smoothed to obtain filtered difference values, and the seam may be eliminated according to the filtered difference values, so that fine vertical lines in the image after seam elimination are avoided and the imaging effect is further optimized.
For example, moving smoothing filtering may be performed on the replaced difference values using a preset filtering window; a specific implementation of the moving smoothing filtering is described below in conjunction with a specific example and is not repeated here.
When the moving smoothing filtering is applied to the difference values, a suitable window size needs to be selected: if the window is too small, the difference compensation values remain discontinuous and severe fine vertical lines may appear; if the window is too large, the image becomes locally layered after seam elimination is performed based on the final difference values, which degrades the seam-elimination effect.
In one example, the window size may be 1 x 55 for 1024 x 1280 images.
In some embodiments, the difference value is an n-element one-dimensional vector, where n is the number of columns of the image to be processed. The j-th value among the difference values of the pixels on the two sides of the seam extracted in step S610 is I(i1, j) minus I(i2, j), where I(i1, j) is the pixel value of the pixel point in row i1, column j of the corrected image to be processed, and I(i2, j) is the pixel value of the pixel point in row i2, column j of the corrected image to be processed. Row i1 belongs to the first portion of the corrected image to be processed, row i2 belongs to the second portion, the seam is the seam between the first portion and the second portion, and |i1 − i2| = 1.
In step S620, the seam elimination according to the difference value may include:
for any pixel point in the j-th column of the second portion of the corrected image to be processed, adding the j-th value in the final difference values to the pixel value of the pixel point; or,
for any pixel point in the j-th column of the first portion of the corrected image to be processed, subtracting the j-th value in the final difference values from the pixel value of the pixel point.
For example, when the difference values are obtained by subtracting the pixel values of the pixel points in the second portion from those of the pixel points in the first portion on the two sides of the seam, once the difference values are determined in the above manner, the pixel values of the pixel points in the first portion or in the second portion of the image may be compensated according to the difference values.
For example, the difference value of the corresponding column may be added to the pixel value of any pixel point in that column of the second portion, or subtracted from the pixel value of any pixel point in that column of the first portion.
Alternatively, when the difference values are determined, the pixel values of the pixel points in both the first portion and the second portion of the image may be compensated.
For example, assuming that the j-th value in the difference values (corresponding to the j-th column of pixels of the image) is 10, x may be subtracted from the pixel value of any pixel point in the j-th column of the first portion, and (10 − x) added to any pixel point in the j-th column of the second portion, where 0 < x < 10.
In order to enable those skilled in the art to better understand the technical solutions provided by the embodiments of the present application, the technical solutions provided by the embodiments of the present application are described below with reference to specific examples.
In this embodiment, infrared image seam elimination for a 1024 × 1280 megapixel-class uncooled infrared focal plane detector (hereinafter referred to as an infrared stitching detector) is taken as an example.
After two-point non-uniformity correction, the imaging map of the infrared stitching detector still shows a "seam"-like non-uniformity, as schematically shown in fig. 7A. This phenomenon mainly arises for the following two reasons:
First, in the circuit design of the infrared stitching detector, the whole focal plane array is equally divided into an upper part and a lower part, each 512 × 1280, driven by two independent circuits. The difference between the circuits makes the responses of the upper and lower halves of the detector non-uniform, so a seam appears in the infrared imaging map.
Second, besides circuit design factors, the thermal stability of the whole device is closely related to the appearance of seams: when the substrate temperature drifts, the non-uniformity between the upper and lower halves may increase or decrease, and the seam in the infrared image gradually becomes more obvious.
For example, a schematic diagram of the relationship between substrate temperature and detector response may be as shown in fig. 8, where up and down denote the pixel response curves of the upper and lower halves of the detector, respectively. As the substrate temperature changes, the two response curves follow different trends, so their difference grows larger and larger, and the correction amount C required to eliminate the seam also changes with time, temperature, environment, etc.
The present application provides a seam elimination scheme for the infrared stitching detector, which resolves the inconsistent response caused by the temperature drift effect by solving for the valid difference between the upper and lower halves and corrects the overall difference of the image.
The scheme mainly comprises target and background segmentation, difference extraction, and compensation. The main flow is as follows: first, non-uniformity correction is performed on the detector data (infrared image), and the original difference values are extracted; then, target and background segmentation is performed on the corrected image to obtain a target-background binary map, from which the invalid difference values are determined, eliminated, and replaced by the mean of the valid difference values; the valid difference values are then smoothed by moving filtering to remove residual noise such as vertical lines; finally, the original data are compensated according to the smoothed difference values. By screening the difference values, the interference of over-bright and over-dark objects is eliminated; the compensated data no longer show the stitching seam, and the overall visual effect of the image is improved.
The following describes the above-mentioned flow in detail:
1. Target and background segmentation
When there is a small over-bright target in the center of the scene, or when two different objects with a large brightness difference happen to lie on the scene seam, compensating according to the differences between the pixel values of the pixel points at that part of the seam would affect other background portions and produce bright or dark wide vertical lines; therefore, this part needs to be extracted and excluded from the subsequent compensation-value calculation.
For example, the maximum between-class variance method (OTSU) may be used to segment the target and the background. The algorithm is independent of the brightness and contrast of the image and divides the image into two parts, background and target, according to its gray-scale characteristics.
The gray levels of the image histogram are divided into a background class and a target class according to a threshold, the between-class variance of the two classes is computed, and the T that maximizes the between-class variance is found by iterating the threshold T. Since variance is a measure of the uniformity of the gray-level distribution, a larger between-class variance between background and target means a larger difference between the two parts composing the image; when part of the target is misclassified as background, or part of the background is misclassified as target, this variance decreases. Therefore, the threshold T corresponding to the maximum between-class variance is the optimal threshold for segmenting the image into target and background.
For example, for an image I(x, y), assume the image size is M × N (in this embodiment, M × N may be 1024 × 1280), and denote the segmentation threshold between target and background as T.
Pixels whose gray value is less than or equal to the threshold T belong to the background; their number is denoted N0 and their average gray level μ0. Pixels whose gray value is greater than the threshold T belong to the target; their number is denoted N1 and their average gray level μ1. The proportions of background and target are then ω0 = N0/(M × N) and ω1 = N1/(M × N).
Let μ be the total average gray level of the image; then the between-class variance g is:
g = ω0 × (μ0 − μ)² + ω1 × (μ1 − μ)²
Since the total average gray level of the image is μ = ω0 × μ0 + ω1 × μ1, combining the above formulas yields:
g = ω0 × ω1 × (μ0 − μ1)²
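For reference, the combined form follows by substituting the total mean into the two-term variance (restating the derivation behind the formulas above):

```latex
\begin{aligned}
\mu &= \omega_0\mu_0 + \omega_1\mu_1, \qquad \omega_0 + \omega_1 = 1\\
\mu_0 - \mu &= \mu_0 - \omega_0\mu_0 - \omega_1\mu_1 = \omega_1(\mu_0 - \mu_1)\\
\mu_1 - \mu &= \mu_1 - \omega_0\mu_0 - \omega_1\mu_1 = -\,\omega_0(\mu_0 - \mu_1)\\
g &= \omega_0\,\omega_1^2(\mu_0 - \mu_1)^2 + \omega_1\,\omega_0^2(\mu_0 - \mu_1)^2
   = \omega_0\,\omega_1(\mu_0 - \mu_1)^2
\end{aligned}
```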
Starting from an initialized threshold T, the optimal threshold T for target-background segmentation is obtained by threshold iteration, and the image is binarized based on this optimal threshold to obtain the target-background binary map:
pixels whose value is less than or equal to T are taken as background, participate in the subsequent operations, and are set to 1; pixels whose value is greater than T do not participate in the valid-difference calculation and are set to 0.
For example, a flowchart of the target background segmentation of the image to obtain a target background binary image may be shown in fig. 9.
As shown in fig. 9, the implementation flow for obtaining the target background binary image by performing target background segmentation on the image may include:
S901: determine the histogram hist(i) of the corrected image and compute the image mean (mean) from the histogram.
Illustratively, i = 1, 2, 3, …, nbins, where nbins is the maximum value of the raw data (i.e., the maximum pixel value, or gray level, in the raw image).
S902: initialize the threshold T and the between-class variance g0 = 0.
For example, when the image mean has been determined as in step S901, the threshold T may be initialized to ⌊mean⌋, i.e., the initial value of T is the image mean rounded down, and the initial value g0 of the between-class variance is set to 0.
S903, dividing the corrected image into a target and a background according to a threshold T, and respectively determining the duty ratio of the target and the background in the image.
For example, when the threshold T is determined, pixels whose gray value is less than or equal to the threshold T may be determined as background, and pixels whose gray value is greater than the threshold T as target.
For example, for an image I(x, y), assume the image size is M × N (in this embodiment, M × N may be 1024 × 1280).
Pixels whose gray value is less than or equal to the threshold T (i.e., I ≤ T) belong to the background; their number is denoted N0 and their average gray level (the background mean) μ0. Pixels whose gray value is greater than the threshold T (i.e., I > T) belong to the target; their number is denoted N1 and their average gray level (the target mean) μ1. The background proportion ω0 and the target proportion ω1 are ω0 = N0/(M × N) and ω1 = N1/(M × N), with ω0 + ω1 = 1, i.e., ω1 = 1 − ω0.
S904, determining the inter-class variance (g) according to the duty ratio of the target and the background.
Illustratively, let μ be the total average gray level of the image; then the between-class variance g is:
g = ω0 × (μ0 − μ)² + ω1 × (μ1 − μ)²
Since the total average gray level of the image is μ = ω0 × μ0 + ω1 × μ1, combining the above formulas yields:
g = ω0 × ω1 × (μ0 − μ1)²
S905: determine whether the computed between-class variance is larger than the current between-class variance g0. If yes, proceed to step S906; otherwise, proceed to step S908.
S906: update the between-class variance to the computed value (i.e., let g0 = g), and proceed to step S908.
S907: iterate the threshold (add 1 to the current threshold, i.e., let T = T + 1), and return to step S903.
Illustratively, assuming the between-class variance is initialized as g0 = 0, if the between-class variance g determined in step S904 satisfies g > g0, g0 is updated to g; the currently used threshold T is then updated to T + 1, and steps S903 to S905 are re-executed.
S908: determine whether T is greater than or equal to nbins; if yes, proceed to step S909; otherwise, proceed to step S907.
S909: set the background to 1 and the target to 0; the resulting target-background binary map completes the target-background segmentation.
For example, the target background segmentation effect may be as shown in fig. 7B.
It should be noted that the histogram hist(i) counted during the target-background segmentation may be reused in the subsequent histogram equalization.
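Under the assumption of a NumPy environment, the segmentation of S901 to S909 can be sketched as follows. For brevity the sketch searches all thresholds directly instead of mirroring the flowchart step by step; function and variable names are assumptions:

```python
import numpy as np

def otsu_binary_map(img, nbins=None):
    """Search the threshold T maximising the between-class variance
    g = w0 * w1 * (mu0 - mu1)**2, then return a binary map in which
    background (gray value <= T) is 1 and target is 0."""
    flat = np.asarray(img).ravel().astype(np.int64)
    nbins = int(flat.max()) + 1 if nbins is None else nbins
    hist = np.bincount(flat, minlength=nbins).astype(np.float64)
    total = flat.size
    best_g, best_t = -1.0, 0
    for t in range(nbins - 1):
        n0 = hist[: t + 1].sum()          # background pixel count
        n1 = total - n0                   # target pixel count
        if n0 == 0 or n1 == 0:
            continue
        mu0 = (hist[: t + 1] * np.arange(t + 1)).sum() / n0
        mu1 = (hist[t + 1:] * np.arange(t + 1, nbins)).sum() / n1
        g = (n0 / total) * (n1 / total) * (mu0 - mu1) ** 2
        if g > best_g:
            best_g, best_t = g, t
    # background (<= T) participates in later steps and is set to 1
    return (np.asarray(img) <= best_t).astype(np.uint8)
```

The exhaustive search visits the same thresholds as the iterative flowchart (T = T + 1 until T reaches nbins) and keeps the T with the largest g, so the resulting binary map is the same.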
2. Original difference extraction
dif=I(512,:)-I(513,:)
where dif is the original difference value and I(x, :) denotes the original pixel values of the pixel points in row x.
For a 1024 × 1280 infrared stitching detector, the seam of the acquired infrared image lies between row 512 and row 513, and the initial difference values can be determined from the pixel-value differences of these two rows.
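The extraction above can be sketched as follows (a NumPy sketch; 0-based indexing is used, so rows 512 and 513 in the 1-based notation of the text are rows 511 and 512 here, and seam_row denotes the first row of the lower half):

```python
import numpy as np

def raw_seam_difference(img, seam_row=512):
    """Per-column raw difference across a horizontal seam: the last
    row of the upper half minus the first row of the lower half.
    With 0-based indexing, seam_row is the first row of the lower
    half (512 for a 1024 x 1280 image)."""
    img = np.asarray(img, dtype=np.float64)
    return img[seam_row - 1, :] - img[seam_row, :]
```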
3. Valid difference value determination
The valid flags flag are obtained from the target-background binary map of step 1:
that is, when the pixel points on the upper and lower sides of the seam are both background, the difference value corresponding to these two pixel points is a valid difference value; otherwise, it is an invalid difference value. A flag of 1 denotes a valid difference value and a flag of 0 an invalid difference value.
The mean of the valid difference values is then determined as the ratio of the sum of the valid difference values to the number of valid difference values.
After each invalid value is replaced by this mean of the valid values, the replaced difference values are obtained. Screening the difference values in this way avoids the corrected wide vertical stripes that bright objects would otherwise cause, as shown in fig. 7C (a).
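The flag computation can be sketched as follows (a NumPy sketch over an assumed 0/1 binary map, with 0-based row indices):

```python
import numpy as np

def seam_valid_flags(bmap, seam_row=512):
    """A column's difference is valid (flag 1) only when the pixel
    points on both sides of the seam are background (value 1 in the
    target-background binary map); otherwise the flag is 0."""
    bmap = np.asarray(bmap)
    both_background = (bmap[seam_row - 1, :] == 1) & (bmap[seam_row, :] == 1)
    return both_background.astype(np.uint8)
```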
4. Smoothing and filtering the difference value after the replacement
Taking moving smoothing filtering as an example, assume the replaced difference values are x = {x1, x2, x3, …, xn} of length n, and the smoothed output (i.e., the final difference values) is y = {y1, y2, y3, …, yn}. If the filter window is 1 × W (W an odd number), let p = (W − 1)/2; then yi is the mean of the values of x within the window centered at position i, i.e., yi = (x(i−p) + … + x(i+p)) / W for p < i ≤ n − p, with the window truncated near the two ends of the sequence.
It should be noted that if the filtering window is too small, the difference compensation values remain discontinuous and a severe fine vertical stripe phenomenon occurs, as shown in fig. 7C (b); if the window is too large, the image becomes locally layered after seam elimination is performed based on the final difference values, which degrades the seam-elimination effect.
Illustratively, taking window 1×55, the smoothed filtered final difference value (which may also be referred to as the compensation value) dif' is calculated.
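The moving smoothing filtering with a 1 × W window can be sketched as follows (a hedged sketch; truncating the window near the ends of the sequence is one of several reasonable edge policies and is an assumption here):

```python
import numpy as np

def moving_smooth(x, w=55):
    """Moving-average smoothing with an odd window 1 x w; near the
    two ends the window is truncated to the samples that exist."""
    x = np.asarray(x, dtype=np.float64)
    p = (w - 1) // 2
    n = x.size
    y = np.empty(n, dtype=np.float64)
    for i in range(n):
        lo, hi = max(0, i - p), min(n, i + p + 1)
        y[i] = x[lo:hi].mean()
    return y
```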
5. Compensation of raw image data
The pixel values of the pixel points in the image after seam elimination are:
out_up = I_up, out_down = I_down + dif′
where I_up is the original pixel value of a pixel point in the upper half of the image, I_down is the original pixel value of a pixel point in the lower half, out_up is the pixel value of an upper-half pixel point after seam elimination, and out_down is the pixel value of a lower-half pixel point after seam elimination.
For each column of pixel points of the lower half image, the pixel value of the pixel point may be added with the compensation value of the column.
Since the difference values are determined at the seam by subtracting the pixel values of the pixel points of the lower-half image from those of the corresponding pixel points of the upper-half image, compensation may be performed either by adding the corresponding compensation value to the pixel values of the lower-half image or by subtracting the corresponding compensation value from the pixel values of the upper-half image.
It should be appreciated that the difference values may instead be determined at the seam by subtracting the pixel values of the corresponding pixel points of the upper-half image from those of the lower-half image; in that case, compensation may be achieved by subtracting the corresponding compensation value from the pixel values of the lower-half image, or by adding the corresponding compensation value to the pixel values of the upper-half image, which is not described in detail here.
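The compensation step can be sketched as follows (a NumPy sketch assuming, as in the text, that dif was taken as upper minus lower, so the smoothed compensation value is added to every row of the lower half):

```python
import numpy as np

def compensate(img, dif_smoothed, seam_row=512):
    """Close the seam: leave the upper half unchanged and add the
    per-column compensation value to every row of the lower half."""
    out = np.asarray(img, dtype=np.float64).copy()
    out[seam_row:, :] += np.asarray(dif_smoothed, dtype=np.float64)
    return out
```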
With this scheme, the seam phenomenon of the stitching detector can be effectively eliminated and the image quality improved; the effect may be as shown in fig. 7D.
It should be noted that, in the embodiments of the present application, after the seam in the infrared image acquired by the infrared stitching detector is eliminated in the above manner, one or more processes such as vertical-stripe removal, overall horizontal-stripe removal, image enhancement, and histogram equalization may further be performed on the processed image to further optimize the image effect; their specific implementations are not described here.
The methods provided herein are described above. The apparatus provided in this application is described below:
referring to fig. 10, a schematic structural diagram of an image seam eliminating device according to an embodiment of the present application is shown in fig. 10, where the image seam eliminating device may include:
a correction unit 1010, configured to perform non-uniformity correction on an image acquired by the infrared focal plane detector, to obtain a corrected image; the focal plane of the infrared focal plane detector comprises at least two parts, the at least two parts are spliced into the focal plane, and each part is driven by an independent circuit;
an extracting unit 1020, configured to extract, for any seam in the corrected image, a difference value of pixels at two sides of the seam in the corrected image; the seam corresponds to a splice location of two adjacent portions of the at least two portions of the focal plane;
And a processing unit 1030, configured to eliminate the seam according to the difference value.
In some embodiments, the processing unit 1030 eliminates the seam according to the difference value, including:
performing target and background segmentation on the corrected image, and determining an effective difference value and an ineffective difference value in the difference values according to segmentation results;
replacing the invalid difference value according to the valid difference value to obtain a replaced difference value;
and eliminating the joint according to the replaced difference value.
In some embodiments, the processing unit 1030 may eliminate the seam according to the replaced difference value, including:
smoothing the replaced difference value to obtain a filtered difference value;
and eliminating the joint according to the filtered difference value.
In some embodiments, for any one of the difference values, when two pixel points on two sides of the seam corresponding to the difference value are both backgrounds, the difference value is an effective difference value; otherwise, the difference value is an invalid difference value.
In some embodiments, the processing unit 1030 replaces the invalid difference value according to the valid difference value, including:
Determining an average value of the effective difference values;
and replacing the invalid difference value with an average value of the valid difference values.
Referring to fig. 11, a schematic structural diagram of an image seam eliminating device according to an embodiment of the present application is shown in fig. 11, where the image seam eliminating device may include:
a correction unit 1110, configured to perform non-uniformity correction on an image to be processed, so as to obtain a corrected image to be processed;
an extracting unit 1120, configured to extract, for any seam in the corrected image to be processed, a difference value of pixels at two sides of the seam in the corrected image to be processed;
processing unit 1130 is configured to eliminate the seam according to the difference value.
In some embodiments, the processing unit 1130 eliminates the seam according to the difference value, including:
performing target and background segmentation on the corrected image to be processed, and determining an effective difference value and an ineffective difference value in the difference values according to segmentation results;
replacing the invalid difference value according to the valid difference value to obtain a replaced difference value;
and eliminating the joint according to the replaced difference value.
In some embodiments, the processing unit 1130 eliminates the seam according to the replaced difference value, including:
smoothing the replaced difference value to obtain a filtered difference value;
and eliminating the joint according to the filtered difference value.
In some embodiments, for any one of the difference values, when two pixel points on two sides of the seam corresponding to the difference value are both backgrounds, the difference value is an effective difference value; otherwise, the difference value is an invalid difference value.
In some embodiments, the processing unit 1130 replaces the invalid difference value according to the valid difference value, including:
determining an average value of the valid difference values;
and replacing the invalid difference value with the average value of the valid difference values.
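The pipeline described by units 1110 to 1130 — extract the per-row differences across the seam, classify each difference as valid or invalid via target/background segmentation, replace invalid differences with the mean of the valid ones, smooth, and subtract — can be sketched as follows. This is an illustrative sketch only, not the application's exact implementation: the function name `eliminate_vertical_seam`, the global-threshold background segmentation, and the 5-tap moving-average smoothing are all assumptions standing in for details the application leaves unspecified, and a vertical seam between columns `col-1` and `col` is assumed.

```python
import numpy as np

def eliminate_vertical_seam(img, col, bg_threshold=None):
    """Remove a brightness step at a vertical seam between columns col-1 and col.

    The threshold-based segmentation and moving-average smoothing below are
    illustrative stand-ins for the segmentation and smoothing the patent
    leaves unspecified.
    """
    img = img.astype(np.float64)
    # Per-row difference value of the pixels on the two sides of the seam.
    diff = img[:, col] - img[:, col - 1]

    # Target/background segmentation: a simple global threshold as a stand-in.
    if bg_threshold is None:
        bg_threshold = img.mean()
    background = img < bg_threshold

    # A difference value is valid only when BOTH pixel points beside the seam
    # are background; a target edge crossing the seam would contaminate it.
    valid = background[:, col - 1] & background[:, col]

    # Replace invalid difference values with the mean of the valid ones.
    if valid.any():
        diff[~valid] = diff[valid].mean()

    # Smooth the replaced difference values (5-tap moving average).
    kernel = np.ones(5) / 5.0
    smoothed = np.convolve(diff, kernel, mode="same")

    # Eliminate the seam: offset everything right of the seam by the
    # estimated per-row step so the two sides line up.
    out = img.copy()
    out[:, col:] -= smoothed[:, None]
    return out
```

Subtracting the smoothed offset from the entire right-hand half (rather than only the seam column) keeps the right half internally consistent while removing the step at the splice location.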
An embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores machine executable instructions executable by the processor, and the processor is configured to execute the machine executable instructions to implement the image seam elimination method described above.
Fig. 12 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application. The electronic device may include a processor 1201 and a memory 1202 storing machine-executable instructions. The processor 1201 and the memory 1202 may communicate via a system bus 1203. The processor 1201 can perform the image seam elimination method described above by reading and executing the machine-executable instructions in the memory 1202 corresponding to the image seam elimination logic.
The memory 1202 referred to herein may be any electronic, magnetic, optical, or other physical storage device that can contain or store information such as executable instructions and data. For example, the machine-readable storage medium may be: RAM (Random Access Memory), volatile memory, non-volatile memory, flash memory, a storage drive (e.g., a hard drive), a solid state drive, any type of storage disk (e.g., optical disk, DVD, etc.), a similar storage medium, or a combination thereof.
In some embodiments, a machine-readable storage medium, such as memory 1202 in fig. 12, is also provided, having stored therein machine-executable instructions that when executed by a processor implement the image seam elimination method described above. For example, the machine-readable storage medium may be ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
Embodiments of the present application also provide a computer program stored on a machine-readable storage medium, such as memory 1202 in fig. 12, and which when executed by a processor causes the processor 1201 to perform the image seam elimination method described above.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The foregoing describes only preferred embodiments of the present invention and is not intended to limit the invention; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (11)

1. An image seam elimination method, comprising:
carrying out non-uniformity correction on an image acquired by an infrared focal plane detector to obtain a corrected image; the focal plane of the infrared focal plane detector comprises at least two parts, the at least two parts are spliced into the focal plane, and each part is driven by an independent circuit;
for any seam in the corrected image, extracting difference values of the pixels on the two sides of the seam in the corrected image; the seam corresponds to a splice location of two adjacent portions of the at least two portions of the focal plane;
eliminating the seam according to the difference values, including: performing target and background segmentation on the corrected image, and determining valid difference values and invalid difference values among the difference values according to the segmentation result; replacing the invalid difference values according to the valid difference values to obtain replaced difference values; and eliminating the seam according to the replaced difference values;
wherein, for any one of the difference values, when the two pixel points on the two sides of the seam corresponding to the difference value are both background, the difference value is a valid difference value; otherwise, the difference value is an invalid difference value.
2. The method of claim 1, wherein said eliminating the seam according to the replaced difference value comprises:
smoothing the replaced difference value to obtain a filtered difference value;
and eliminating the seam according to the filtered difference value.
3. The method of claim 1, wherein said replacing said invalid difference value according to said valid difference value comprises:
determining an average value of the valid difference values;
and replacing the invalid difference value with the average value of the valid difference values.
4. An image seam elimination method, comprising:
carrying out non-uniformity correction on the image to be processed to obtain a corrected image to be processed;
for any seam in the corrected image to be processed, extracting difference values of the pixels on the two sides of the seam in the corrected image to be processed;
eliminating the seam according to the difference values, including: performing target and background segmentation on the corrected image to be processed, and determining valid difference values and invalid difference values among the difference values according to the segmentation result; replacing the invalid difference values according to the valid difference values to obtain replaced difference values; and eliminating the seam according to the replaced difference values;
wherein, for any one of the difference values, when the two pixel points on the two sides of the seam corresponding to the difference value are both background, the difference value is a valid difference value; otherwise, the difference value is an invalid difference value.
5. The method of claim 4, wherein said eliminating the seam according to the replaced difference value comprises:
smoothing the replaced difference value to obtain a filtered difference value;
and eliminating the seam according to the filtered difference value.
6. The method of claim 4, wherein said replacing said invalid difference value according to said valid difference value comprises:
determining an average value of the valid difference values;
and replacing the invalid difference value with the average value of the valid difference values.
7. An image seam elimination device, comprising:
the correcting unit is used for carrying out non-uniformity correction on the image acquired by the infrared focal plane detector to obtain a corrected image; the focal plane of the infrared focal plane detector comprises at least two parts, the at least two parts are spliced into the focal plane, and each part is driven by an independent circuit;
an extracting unit, configured to extract, for any seam in the corrected image, difference values of the pixels on the two sides of the seam in the corrected image; the seam corresponds to a splice location of two adjacent portions of the at least two portions of the focal plane;
a processing unit, configured to eliminate the seam according to the difference values, including: performing target and background segmentation on the corrected image, and determining valid difference values and invalid difference values among the difference values according to the segmentation result; replacing the invalid difference values according to the valid difference values to obtain replaced difference values; and eliminating the seam according to the replaced difference values;
wherein, for any one of the difference values, when the two pixel points on the two sides of the seam corresponding to the difference value are both background, the difference value is a valid difference value; otherwise, the difference value is an invalid difference value.
8. The apparatus of claim 7, wherein the processing unit eliminating the seam according to the replaced difference value comprises:
smoothing the replaced difference value to obtain a filtered difference value;
eliminating the seam according to the filtered difference value;
the processing unit replacing the invalid difference value according to the valid difference value, including:
determining an average value of the valid difference values;
and replacing the invalid difference value with the average value of the valid difference values.
9. An image seam elimination device, comprising:
the correction unit is used for carrying out non-uniformity correction on the image to be processed to obtain a corrected image to be processed;
an extracting unit, configured to extract, for any seam in the corrected image to be processed, a difference value of pixels at two sides of the seam in the corrected image to be processed;
a processing unit, configured to eliminate the seam according to the difference values, including: performing target and background segmentation on the corrected image to be processed, and determining valid difference values and invalid difference values among the difference values according to the segmentation result; replacing the invalid difference values according to the valid difference values to obtain replaced difference values; and eliminating the seam according to the replaced difference values; wherein, for any one of the difference values, when the two pixel points on the two sides of the seam corresponding to the difference value are both background, the difference value is a valid difference value; otherwise, the difference value is an invalid difference value.
10. The apparatus of claim 9, wherein the processing unit eliminating the seam according to the replaced difference value comprises:
smoothing the replaced difference value to obtain a filtered difference value;
eliminating the seam according to the filtered difference value;
the processing unit replacing the invalid difference value according to the valid difference value, including:
determining an average value of the valid difference values;
and replacing the invalid difference value with the average value of the valid difference values.
11. An electronic device comprising a processor and a memory, the memory storing machine executable instructions executable by the processor for executing the machine executable instructions to implement the method of any of claims 1-3 or 4-6.
CN202011540592.8A 2020-12-23 2020-12-23 Image seam elimination method and device and electronic equipment Active CN112700382B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011540592.8A CN112700382B (en) 2020-12-23 2020-12-23 Image seam elimination method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011540592.8A CN112700382B (en) 2020-12-23 2020-12-23 Image seam elimination method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN112700382A CN112700382A (en) 2021-04-23
CN112700382B true CN112700382B (en) 2024-03-26

Family

ID=75509367

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011540592.8A Active CN112700382B (en) 2020-12-23 2020-12-23 Image seam elimination method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112700382B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102708558A (en) * 2012-06-01 2012-10-03 惠州华阳通用电子有限公司 Video image mosaic device, mosaic method and video surveillance system
CN105530443A (en) * 2015-12-17 2016-04-27 天津天地伟业数码科技有限公司 Multichannel CCD (Charge Couple Device) image splicing seam elimination method based on FPGA
CN105931203A (en) * 2016-04-26 2016-09-07 成都市晶林科技有限公司 Infrared image stripe filtering method based on statistical relative stripe removal method
CN106780396A (en) * 2016-12-30 2017-05-31 上海集成电路研发中心有限公司 A kind of method for eliminating image piece
CN107967078A (en) * 2016-10-20 2018-04-27 南京仁光电子科技有限公司 A kind of scaling method of mosaic screen touch point
WO2019052534A1 (en) * 2017-09-15 2019-03-21 腾讯科技(深圳)有限公司 Image stitching method and device, and storage medium
CN111879412A (en) * 2020-08-03 2020-11-03 烟台艾睿光电科技有限公司 Image generation method and device for refrigeration type infrared detector and readable storage medium
CN111932478A (en) * 2020-08-10 2020-11-13 国科天成(北京)科技有限公司 Self-adaptive non-uniform correction method for uncooled infrared focal plane

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6912322B2 (en) * 2000-12-14 2005-06-28 Itt Manufacturing Enterprises Inc. Adaptive process for removing streaks in multi-band digital images
TWI533675B (en) * 2013-12-16 2016-05-11 國立交通大學 Optimal dynamic seam adjustment system and method for images stitching

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102708558A (en) * 2012-06-01 2012-10-03 惠州华阳通用电子有限公司 Video image mosaic device, mosaic method and video surveillance system
CN105530443A (en) * 2015-12-17 2016-04-27 天津天地伟业数码科技有限公司 Multichannel CCD (Charge Couple Device) image splicing seam elimination method based on FPGA
CN105931203A (en) * 2016-04-26 2016-09-07 成都市晶林科技有限公司 Infrared image stripe filtering method based on statistical relative stripe removal method
CN107967078A (en) * 2016-10-20 2018-04-27 南京仁光电子科技有限公司 A kind of scaling method of mosaic screen touch point
CN106780396A (en) * 2016-12-30 2017-05-31 上海集成电路研发中心有限公司 A kind of method for eliminating image piece
WO2019052534A1 (en) * 2017-09-15 2019-03-21 腾讯科技(深圳)有限公司 Image stitching method and device, and storage medium
CN111879412A (en) * 2020-08-03 2020-11-03 烟台艾睿光电科技有限公司 Image generation method and device for refrigeration type infrared detector and readable storage medium
CN111932478A (en) * 2020-08-10 2020-11-13 国科天成(北京)科技有限公司 Self-adaptive non-uniform correction method for uncooled infrared focal plane

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Radiometric calibration of Mars HiRISE high resolution imagery based on FPGA; Hou, Y. et al.; Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.; full text *
Infrared image stitching in inland-waterway vessel safety ***; Guo Dongliang et al.; Journal of Chongqing Institute of Technology (Natural Science Edition); Vol. 23, No. 10; full text *
Research on an infrared scene generation method based on resistor-array stitching; Yang Chaojun et al.; Infrared Technology; Vol. 36, No. 06; full text *
Research on seam elimination algorithms for remote sensing image mosaics; Wang Jun; China Master's and Doctoral Dissertations Full-text Database, Information Science and Technology series; No. 07; full text *
Research on methods for eliminating mosaic seams in remote sensing color images; Zhou Tinggang; Computer Engineering and Applications; No. 36; pp. 84-86 *

Also Published As

Publication number Publication date
CN112700382A (en) 2021-04-23

Similar Documents

Publication Publication Date Title
US9105107B2 (en) Apparatus and method for image processing
De et al. Multi-focus image fusion using a morphology-based focus measure in a quad-tree structure
CN108492262B (en) No-ghost high-dynamic-range imaging method based on gradient structure similarity
JP4454657B2 (en) Blur correction apparatus and method, and imaging apparatus
CN110866924A (en) Line structured light center line extraction method and storage medium
JP5890547B2 (en) Image processing device
JP2015011717A (en) Ghost artifact detection and removal methods in hdr image processing using multi-scale normalized cross-correlation
CN111161172B (en) Infrared image column direction stripe eliminating method, system and computer storage medium
WO2010021009A1 (en) Image correction device and image correction method
CN110147816B (en) Method and device for acquiring color depth image and computer storage medium
CN115578286A (en) High dynamic range hybrid exposure imaging method and apparatus
CN112017130B (en) Image restoration method based on self-adaptive anisotropic total variation regularization
CN112700382B (en) Image seam elimination method and device and electronic equipment
CN110175972B (en) Infrared image enhancement method based on transmission map fusion
US11302017B2 (en) Generating composite image from multiple images captured for subject
CN103985089B (en) With reference to weight edge analysis and the image streak correction method of frame inner iteration
Graham et al. Blind restoration of space-variant Gaussian-like blurred images using regional PSFs
CN111612720B (en) Wide-angle infrared image optimization method, system and related components
WO2017153410A1 (en) Method for generating a noise-reduced image based on a noise model of multiple images, as well as camera system and motor vehicle
Filipe et al. Improved patch-based view rendering for focused plenoptic cameras with extended depth-of-field
Lee et al. Multi-image high dynamic range algorithm using a hybrid camera
CN116703958B (en) Edge contour detection method, system, equipment and storage medium for microscopic image
Junjie et al. An image defocus deblurring method based on gradient difference of boundary neighborhood
CN116109640B (en) Workpiece surface small defect detection method in industrial detection
Shoji et al. Shape from focus using color segmentation and bilateral filter

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant