CN115063408B - Image processing method, image processing device, computer equipment and storage medium - Google Patents

Publication number
CN115063408B
CN115063408B (application number CN202210894948.0A)
Authority
CN
China
Prior art keywords
image
pixel
characteristic
processed
position information
Prior art date
Legal status
Active
Application number
CN202210894948.0A
Other languages
Chinese (zh)
Other versions
CN115063408A (en)
Inventor
曾辉
Current Assignee
Yuexin Semiconductor Technology Co.,Ltd.
Original Assignee
Guangzhou Yuexin Semiconductor Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Yuexin Semiconductor Technology Co Ltd filed Critical Guangzhou Yuexin Semiconductor Technology Co Ltd
Priority to CN202210894948.0A priority Critical patent/CN115063408B/en
Publication of CN115063408A publication Critical patent/CN115063408A/en
Application granted granted Critical
Publication of CN115063408B publication Critical patent/CN115063408B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V 30/40 Document-oriented image-based pattern recognition
    • G06V 30/42 Document-oriented image-based pattern recognition based on the type of document
    • G06V 30/422 Technical drawings; Geographical maps
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30148 Semiconductor; IC; Wafer

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Artificial Intelligence (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to an image processing method and apparatus, a computer device, and a storage medium. The method includes: acquiring an image to be processed, the image to be processed being a scanning electron microscope (SEM) image; preprocessing the image to be processed to obtain a feature image corresponding to each edge region in the image to be processed; performing edge tracking on each feature image to obtain an edge curve of each edge region; and superimposing each edge curve correspondingly on the image to be processed to form a target image with the edge curves identified. With this method, pattern edges can be accurately identified.

Description

Image processing method, image processing device, computer equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, a computer device, and a storage medium.
Background
The patterning process (photolithography and etching) transfers the design layout to the wafer surface and is a key link in chip manufacturing. The deviation between the wafer-surface pattern dimensions and the design layout is the core index for evaluating patterning-process performance.
To compensate for this deviation, optical proximity correction (OPC) is usually applied to compensate and correct each line segment of the original design layout, so that the after-development-inspection (ADI) and after-etch-inspection (AEI) patterns restore the design layout as faithfully as possible.
Detecting whether the ADI and AEI patterns faithfully restore the design layout requires determining their edge profiles. Because the lines of the ADI and AEI patterns have a certain line width, their edge profiles are difficult to determine accurately; it is therefore difficult to detect whether the design layout has been faithfully restored, accurate feedback adjustment is hard to perform, and the compensation and correction effect is poor.
Disclosure of Invention
In view of the above technical problems, it is necessary to provide an image processing method, an apparatus, a computer device, a computer-readable storage medium, and a computer program product that can accurately determine and identify an edge curve.
In a first aspect, the present application provides an image processing method, comprising:
acquiring an image to be processed, wherein the image to be processed is a scanning electron microscope image;
preprocessing the image to be processed to obtain a feature image corresponding to each edge region in the image to be processed;
performing edge tracking on each feature image to obtain an edge curve of each edge region;
and superimposing each edge curve correspondingly on the image to be processed to form a target image with the edge curves identified.
In one embodiment, preprocessing the image to be processed includes:
filtering the image to be processed;
binarizing the filtered image to obtain a binary grayscale image;
performing connected-domain segmentation on the binary grayscale image to separate out each connected-domain image, wherein each connected-domain image includes at least one edge region;
and acquiring the feature image corresponding to each connected-domain image according to each connected-domain image and the filtered image.
In one embodiment, obtaining the feature image corresponding to each connected-domain image according to each connected-domain image and the filtered image includes:
multiplying the gray value of each pixel in the connected-domain image by the gray value of the corresponding pixel in the filtered image to obtain a target gray value for each pixel of the connected-domain image;
and acquiring each feature image based on each connected-domain image and the target gray values.
In one embodiment, performing edge tracking on the feature image to obtain the edge curve of the edge region includes:
determining the position information of the (i+1)th feature pixel according to the position information of the ith feature pixel and of the (i-1)th feature pixel on the feature image, until the position information of the last feature pixel of the edge region is obtained, where i ≥ 2;
and determining the edge curve according to the position information from the first feature pixel to the last feature pixel.
In one embodiment, before obtaining the position information of the ith feature pixel and of the (i-1)th feature pixel on the feature image, the method further includes:
acquiring the position information of a first feature pixel of the edge region, wherein when the edge region touches a boundary of the corresponding feature image, the first feature pixel is the pixel with the highest gray value on that image boundary, and when the edge region does not touch the image boundary, the first feature pixel is the pixel with the highest gray value in the edge region;
and determining the position information of a second feature pixel according to the position information of the first feature pixel, wherein the second feature pixel lies on the vertical axis through the first feature pixel and is adjacent to it.
In one embodiment, determining the position information of the (i+1)th feature pixel according to the position information of the ith and (i-1)th feature pixels on the feature image includes:
determining candidate pixels for the (i+1)th feature pixel according to the position information of the ith and (i-1)th feature pixels on the feature image;
and taking the position information of the candidate pixel with the highest gray value as the position information of the (i+1)th feature pixel.
In one embodiment, determining the candidate pixels for the (i+1)th feature pixel according to the position information of the ith and (i-1)th feature pixels on the feature image includes:
determining candidate extension directions of the edge curve according to the position information of the ith and (i-1)th feature pixels, wherein the direction from the (i-1)th feature pixel to the ith feature pixel is taken as the reference direction, and the angle between each candidate extension direction and the reference direction is less than 90 degrees;
and determining the pixels located in the candidate extension directions and adjacent to the ith feature pixel as the candidate pixels for the (i+1)th feature pixel.
In one embodiment, before preprocessing the image to be processed, the method further includes:
compressing the image to be processed according to a preset compression ratio;
and before superimposing each edge curve correspondingly on the image to be processed, the method further includes:
expanding the edge curve according to the preset compression ratio.
In a second aspect, the present application further provides an image processing apparatus. The apparatus includes:
a first acquisition module, configured to acquire an image to be processed, wherein the image to be processed is a scanning electron microscope image;
a preprocessing module, configured to preprocess the image to be processed to acquire a feature image corresponding to each edge region of the image to be processed;
a second acquisition module, configured to perform edge tracking on each feature image to obtain an edge curve of each edge region;
and a superposition module, configured to superimpose each edge curve correspondingly on the image to be processed to form a target image with the edge curves identified.
In a third aspect, the present application further provides a computer device. The computer device comprises a memory storing a computer program and a processor that, when executing the computer program, implements the following steps:
acquiring an image to be processed, wherein the image to be processed is a scanning electron microscope image;
preprocessing the image to be processed to obtain a feature image corresponding to each edge region in the image to be processed;
performing edge tracking on each feature image to obtain an edge curve of each edge region;
and superimposing each edge curve correspondingly on the image to be processed to form a target image with the edge curves identified.
In a fourth aspect, the present application further provides a computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the following steps:
acquiring an image to be processed, wherein the image to be processed is a scanning electron microscope image;
preprocessing the image to be processed to obtain a feature image corresponding to each edge region in the image to be processed;
performing edge tracking on each feature image to obtain an edge curve of each edge region;
and superimposing each edge curve correspondingly on the image to be processed to form a target image with the edge curves identified.
In a fifth aspect, the present application further provides a computer program product comprising a computer program that, when executed by a processor, implements the following steps:
acquiring an image to be processed, wherein the image to be processed is a scanning electron microscope image;
preprocessing the image to be processed to obtain a feature image corresponding to each edge region in the image to be processed;
performing edge tracking on each feature image to obtain an edge curve of each edge region;
and superimposing each edge curve correspondingly on the image to be processed to form a target image with the edge curves identified.
According to the image processing method and apparatus, the computer device, the storage medium, and the computer program product described above, the image to be processed is preprocessed to obtain the feature image of each edge region; an edge curve is then acquired for each edge region; and once determined, the edge curves are superimposed on the image to be processed. Pattern edges can thus be accurately identified, so that whether the ADI and AEI patterns faithfully restore the design layout can be accurately determined.
Drawings
FIG. 1 is a flow diagram illustrating a method for image processing according to one embodiment;
FIG. 2 is a schematic illustration of an image to be processed in one embodiment;
FIGS. 3-7 are schematic diagrams of the feature images obtained by preprocessing the image shown in FIG. 2;
FIG. 8 is a target image formed by superimposing edge curves on the image shown in FIG. 2, wherein the black curves in the edge regions are edge curves;
FIG. 9 is a schematic flow chart illustrating the preprocessing step performed on the image to be processed according to an embodiment;
FIG. 10 is a schematic diagram of the binary grayscale image obtained by filtering and binarizing the image shown in FIG. 2;
FIGS. 11-15 are schematic diagrams of the connected-domain images obtained by performing connected-domain segmentation on the image shown in FIG. 10;
FIG. 16 is a diagram illustrating the determination of candidate pixels for the (i + 1) th feature pixel in one embodiment;
FIG. 17 is a block diagram showing a configuration of an image processing apparatus according to an embodiment;
FIG. 18 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the application and do not limit it.
In one embodiment, as shown in fig. 1, the present application provides an image processing method including:
s101: and acquiring an image to be processed, wherein the image to be processed is a scanning electron microscope image.
In application, the image to be processed may be a gray scale image without measurement results and other marks.
S102: and preprocessing the image to be processed to obtain a characteristic image corresponding to each marginal area in the image to be processed.
It can be understood that the feature images corresponding to the edge regions of the image to be processed are obtained by preprocessing the image to be processed, so that the edge regions of the image to be processed are divided, and the edge regions of the image to be processed are processed respectively in the following. Due to the problem of image resolution, the boundary between two different patterns on the scanning electron microscope image is a boundary region with a certain line width, i.e. an edge region.
For example, when the image to be processed is the image shown in FIG. 2, which contains edge regions 11, 12, 13, 14, and 15, the feature images obtained by preprocessing are shown in FIGS. 3-7: edge region 11a in FIG. 3 corresponds to edge region 11 in FIG. 2, edge region 12a in FIG. 4 to edge region 12, edge region 13a in FIG. 5 to edge region 13, edge region 14a in FIG. 6 to edge region 14, and edge region 15a in FIG. 7 to edge region 15.
S103: and carrying out edge tracking on each characteristic image to obtain an edge curve of each edge region.
Wherein the edge curve is located in the edge region.
S104: and correspondingly superposing each edge curve on the image to be processed to form a target image with the identified edge curve.
Specifically, as shown in fig. 8, an edge curve (a black curve portion) is correspondingly superimposed on the image to be processed, and a target image in which the edge curve is identified is formed, so that an edge profile of each pattern in the image to be processed, for example, an edge profile of a groove, is determined.
In this embodiment, preprocessing the image to be processed yields the feature image of each edge region; an edge curve is then acquired for each edge region; and once determined, the edge curves are superimposed on the image to be processed. Pattern edges can thus be accurately identified, so that whether the ADI and AEI patterns faithfully restore the design layout can be accurately determined.
In one embodiment, as shown in fig. 9, the pre-processing of the image to be processed includes:
s901: and carrying out filtering processing on the image to be processed.
The image to be processed may be filtered by a median filtering method. The median filtering method is a non-linear smoothing technique, and sets the gray value of each pixel as the median of the gray values of all pixel points in a certain neighborhood window of the pixel. By performing median filtering on the image to be processed, salt and pepper noise in the image to be processed can be removed.
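As a minimal illustration of this filtering step, the following pure-NumPy sketch (the function name and the 3 x 3 window size are our own choices, not specified in the application) shows how a median filter removes an isolated salt-and-pepper speck while leaving flat regions untouched:

```python
import numpy as np

def median_filter(img, k=3):
    """Replace each pixel with the median of its k x k neighborhood
    (edges padded by reflection). Illustrative sketch only."""
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            # median over the k x k window centered on (y, x)
            out[y, x] = np.median(padded[y:y + k, x:x + k])
    return out

# An isolated "salt" speck on a flat region is removed entirely.
img = np.full((5, 5), 100, dtype=np.uint8)
img[2, 2] = 255
print(median_filter(img)[2, 2])  # 100
```

A library routine such as a dedicated median-filter function would normally replace the explicit loops; the loop form is kept here only to make the windowing explicit.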
S902: and carrying out binarization processing on the filtered image to obtain a binary gray image.
Specifically, the filtering image is subjected to binarization processing, that is, the grayscale value of the pixel in the filtering image with the grayscale value smaller than the preset threshold is set to 0, and the grayscale value of the pixel in the filtering image with the grayscale value larger than the preset threshold is set to a corresponding preset value, for example, 255. Thereby separating the background area from the edge area. After the image to be processed shown in fig. 2 is subjected to filtering processing, an image obtained by binarization processing is shown in fig. 10, and the edge regions 11, 12, 13, 14, and 15 in fig. 2 are edge regions 11a, 12a, 13a, 14a, and 15a in fig. 10.
Illustratively, after graying out the picture, we select 127 (half of the grayscale range) as the threshold, i.e., all pixel values with pixel values greater than 127 are set to 255, and all pixel values less than 127 are set to 0.
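The thresholding rule above can be sketched in a few lines of NumPy (the function name is our own, and pixels exactly equal to the threshold are mapped to 0 here, a detail the application leaves open):

```python
import numpy as np

def binarize(img, threshold=127, high=255):
    """Binarization step: pixels strictly above the threshold become
    `high`, the rest become 0 (threshold 127 per the example above)."""
    return np.where(img > threshold, high, 0).astype(np.uint8)

img = np.array([[10, 127, 128],
                [200, 50, 255]], dtype=np.uint8)
print(binarize(img).tolist())  # [[0, 0, 255], [255, 0, 255]]
```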
S903: and performing connected domain segmentation on the binary gray level image to segment each connected domain image, wherein each connected domain image comprises at least one marginal area.
It can be understood that, in the binary gray image, the pixel with the gray value of zero is a background pixel, the pixel with the gray value larger than zero is a graphic pixel, the gray values of the graphic pixels are the same, and the connected region is a graphic region composed of pixels with the same gray value and adjacent positions, so that the connected region segmentation can be realized according to the position information and the gray value of each pixel in the binary gray image, and the connected region images each including different edge regions are segmented. Exemplarily, the connected component image shown in fig. 11 to 15 is obtained by performing connected component segmentation on the binary grayscale image shown in fig. 10, and the edge regions 11a, 12a, 13a, 14a, and 15a in fig. 10 are segmented into the connected component image shown in fig. 11 to 15.
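A minimal sketch of connected-domain segmentation, assuming 8-connectivity (the application does not state which connectivity is used), can be written as a breadth-first labeling pass:

```python
import numpy as np
from collections import deque

def connected_components(binary):
    """Label 8-connected foreground regions of a binary image by BFS.
    Returns (label map, number of components); 0 = background."""
    labels = np.zeros(binary.shape, dtype=int)
    next_label = 0
    h, w = binary.shape
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and not labels[sy, sx]:
                next_label += 1            # start a new component
                labels[sy, sx] = next_label
                queue = deque([(sy, sx)])
                while queue:
                    y, x = queue.popleft()
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and binary[ny, nx]
                                    and not labels[ny, nx]):
                                labels[ny, nx] = next_label
                                queue.append((ny, nx))
    return labels, next_label

binary = np.array([[1, 1, 0, 0],
                   [0, 1, 0, 1],
                   [0, 0, 0, 1]])
labels, n = connected_components(binary)
print(n)  # 2
```

Each label then selects one connected-domain image (a 0/1 mask) for the next step.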
S904: and acquiring the characteristic image corresponding to each connected domain image according to each connected domain image and the filtered image.
Specifically, each feature image is acquired based on each connected domain image and the filtered image, so that the gray value of the corresponding background pixel in each feature image is zero, the gray values of the rest pixels correspond to the gray values of the corresponding pixels in the filtered image, and the gray values of the corresponding image pixels are no longer the same, so that the edge curve of each connected domain image can be determined based on the gray values of the pixels.
In one embodiment, acquiring the feature image corresponding to each connected-domain image according to each connected-domain image and the filtered image includes: multiplying the gray value of each pixel in the connected-domain image by the gray value of the corresponding pixel in the filtered image to obtain a target gray value for each pixel of the connected-domain image; and acquiring each feature image based on each connected-domain image and the target gray values.
Illustratively, the feature images shown in FIGS. 3-7 may be obtained from the connected-domain images shown in FIGS. 11-15 and the respective target gray values; accordingly, edge regions 11a, 12a, 13a, 14a, and 15a become edge regions 11b, 12b, 13b, 14b, and 15b.
If, during binarization, the gray value of pixels above the preset threshold was set to a preset value X (rather than one), then after the target gray value of each pixel in each connected-domain image is determined, the gray value of each pixel in the connected-domain image is set to the target gray value divided by X to obtain each feature image.
It can be understood that, since the gray value of a background pixel is zero, the product is zero whatever the gray value of the corresponding pixel (the pixel with the same position information) in the filtered image; after multiplying the gray value of each pixel in the connected-domain image by the gray value of the corresponding pixel in the filtered image, background pixels therefore still have gray value zero, while the gray values of the pattern pixels are reset, so that the edge curve can be determined from the reset pattern-pixel gray values.
For example, if pixel A in the filtered image corresponds to a background pixel and pixel B to a pattern pixel, then after the above steps the gray value corresponding to pixel A in the feature image is zero, and the gray value corresponding to pixel B in the feature image equals the gray value of pixel B in the filtered image.
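The multiplication step can be illustrated with toy data (all values invented for illustration): the 0/1 mask of one connected domain zeroes out everything outside it while preserving the filtered gray values inside:

```python
import numpy as np

# Filtered image (gray values) and the 0/1 mask of one connected domain.
filtered = np.array([[90, 200, 10],
                     [80, 220, 30]], dtype=np.uint8)
mask = np.array([[0, 1, 0],
                 [0, 1, 0]], dtype=np.uint8)

# Element-wise product: background stays 0, pattern pixels keep
# their filtered gray value.
feature = mask * filtered
print(feature.tolist())  # [[0, 200, 0], [0, 220, 0]]
```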
In one embodiment, binarizing the filtered image to obtain the binary grayscale image includes: acquiring the gray value of each pixel of the filtered image; setting the gray value of pixels below the preset threshold to zero and the gray value of pixels above the preset threshold to one.
It can be understood that when the gray value of pixels above the preset threshold is set to one, then after the target gray value of each pixel in each connected-domain image is determined, the gray values of the pixels in each connected-domain image are directly set to the target gray values to obtain each feature image.
In one embodiment, performing edge tracking on the feature image to obtain the edge curve of the edge region includes: determining the position information of the (i+1)th feature pixel according to the position information of the ith and (i-1)th feature pixels on the feature image, until the position information of the last feature pixel of the edge region is obtained, where i ≥ 2; and determining the edge curve according to the position information from the first feature pixel to the last feature pixel.
In this way, because the position of the (i+1)th feature pixel is determined from the positions of both the ith and (i-1)th feature pixels, the probability of extending in a wrong direction is reduced and the formation of a spurious closed curve is avoided.
In one embodiment, before obtaining the position information of the ith and (i-1)th feature pixels on the feature image, the method further includes: acquiring the position information of the first feature pixel of the edge region, wherein when the edge region touches a boundary of the corresponding feature image, the first feature pixel is the pixel with the highest gray value on that image boundary, and when the edge region does not touch the image boundary, the first feature pixel is the pixel with the highest gray value in the edge region; and determining the position information of the second feature pixel from the position information of the first feature pixel, wherein the second feature pixel lies on the vertical axis through the first feature pixel and is adjacent to it.
It can be understood that the first feature pixel is the starting point of the edge curve. When the edge region touches the boundary of the corresponding feature image, a starting point not located on that boundary could leave the determined edge curve incomplete; taking the pixel with the highest gray value on the image boundary as the first feature pixel therefore anchors the starting point on the boundary and avoids an incomplete edge curve.
Specifically, when the first feature pixel lies on the image boundary, the second feature pixel is uniquely determined. When it does not, two pixels satisfy the condition for the second feature pixel, and one of them is selected as the second feature pixel. In practice, when two candidate pixels qualify, the pixel below the first feature pixel may be taken as the second feature pixel by default.
In one embodiment, determining the position information of the (i+1)th feature pixel according to the position information of the ith and (i-1)th feature pixels on the feature image includes: determining candidate pixels for the (i+1)th feature pixel according to the position information of the ith and (i-1)th feature pixels on the feature image; and taking the position information of the candidate pixel with the highest gray value as the position information of the (i+1)th feature pixel.
In one embodiment, determining the candidate pixels for the (i+1)th feature pixel according to the position information of the ith and (i-1)th feature pixels on the feature image includes: determining candidate extension directions of the edge curve according to the position information of the ith and (i-1)th feature pixels, wherein the direction from the (i-1)th feature pixel to the ith feature pixel is taken as the reference direction and the angle between each candidate extension direction and the reference direction is less than 90 degrees; and determining the pixels located in the candidate extension directions and adjacent to the ith feature pixel as the candidate pixels.
It can be understood that when the angle between a candidate extension direction and the reference direction is less than 90 degrees, only 3 pixels adjacent to the ith feature pixel lie in the candidate extension directions. In practice, the angles between the candidate extension directions and the reference direction can be taken as 0 degrees and 45 degrees.
For example, FIG. 16 is a grayscale diagram of the pixels in an edge region, where each grid cell is a pixel, (xc, yc) is the current point of the edge curve being traced, i.e., the ith feature pixel, and (x0, y0) is the previous point, i.e., the (i-1)th feature pixel. The (i+1)th feature pixel is determined as follows: extend (x0, y0) -> (xc, yc) forward to obtain the reference direction; the pixels (x1, y1), (x2, y2), and (x3, y3) adjacent to (xc, yc) lie in the candidate extension directions and are the candidate pixels; read the gray values of the three candidate pixels and select the one with the largest gray value as the (i+1)th feature pixel.
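One step of this tracking procedure might be sketched as follows (a hedged pure-NumPy illustration using (row, col) coordinates; the "angle less than 90 degrees" test is implemented as a strictly positive dot product, which for any of the eight reference directions selects exactly the three neighbors at 0 and 45 degrees):

```python
import numpy as np

def next_feature_pixel(gray, prev, cur):
    """One edge-tracking step: with the direction prev -> cur as the
    reference, the three neighbors of `cur` whose offset makes an
    angle < 90 degrees with the reference are the candidates; the
    candidate with the highest gray value becomes the next pixel."""
    ref = (cur[0] - prev[0], cur[1] - prev[1])
    h, w = gray.shape
    best, best_val = None, -1
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if (dy, dx) == (0, 0):
                continue
            # strictly positive dot product <=> angle < 90 degrees
            if dy * ref[0] + dx * ref[1] <= 0:
                continue
            ny, nx = cur[0] + dy, cur[1] + dx
            if 0 <= ny < h and 0 <= nx < w and gray[ny, nx] > best_val:
                best, best_val = (ny, nx), gray[ny, nx]
    return best

gray = np.array([[0,  0,  0,  0],
                 [0, 50, 60,  0],
                 [0,  0, 80, 90],
                 [0,  0,  0,  0]])
# previous point (1, 1), current point (1, 2): the reference direction
# points right, so the three right-hand neighbors are the candidates.
print(next_feature_pixel(gray, (1, 1), (1, 2)))  # (2, 3)
```

Iterating this step from the first two feature pixels, and stopping at the last feature pixel of the edge region, yields the full edge curve.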
In one embodiment, before preprocessing the image to be processed, the method further includes: compressing the image to be processed according to a preset compression ratio; and before superimposing each edge curve correspondingly on the image to be processed, the method further includes: expanding the edge curve according to the preset compression ratio.
In this embodiment, compressing the image to be processed shortens the processing time of the subsequent steps. Since each edge curve is obtained from the compressed image while the original image to be processed is not compressed, each edge curve must be expanded by the preset compression ratio so that it matches the size of the original image and can be superimposed on it correspondingly.
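The compress-then-expand bookkeeping can be sketched as follows (the ratio and the curve coordinates are hypothetical values for illustration):

```python
# Preset compression ratio (hypothetical value): the image is shrunk
# by this factor before preprocessing and tracking.
ratio = 4

# Coordinates of an edge curve traced on the compressed image,
# as (row, col) pairs.
curve_compressed = [(10, 3), (10, 4), (11, 5)]

# Expand each point back to the coordinate system of the original,
# uncompressed image before superimposing the curve on it.
curve_full = [(y * ratio, x * ratio) for (y, x) in curve_compressed]
print(curve_full)  # [(40, 12), (40, 16), (44, 20)]
```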
It should be understood that, although the steps in the flowcharts of the embodiments described above are displayed in sequence as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, there is no strict restriction on the execution order of these steps, and they may be performed in other orders. Moreover, at least a part of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or alternately with other steps, or with at least a part of the sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application further provides an image processing apparatus for implementing the image processing method described above. The solution provided by the apparatus is similar to that described for the method, so for the specific limitations in the one or more embodiments of the image processing apparatus provided below, reference may be made to the limitations on the image processing method above; details are not repeated here.
In one embodiment, as shown in fig. 17, there is provided an image processing apparatus 1700 including: a first fetch module 1701, a pre-processing module 1702, a second fetch module 1703, and a superposition module 1704, wherein:
a first obtaining module 1701, configured to obtain an image to be processed, where the image to be processed is a scanning electron microscope image;
a preprocessing module 1702, configured to preprocess an image to be processed to obtain a feature image corresponding to each edge region of the image to be processed;
a second obtaining module 1703, configured to perform edge tracking on each feature image to obtain an edge curve of each edge region;
and an overlaying module 1704, configured to correspondingly overlay each edge curve on the image to be processed to form a target image with the edge curves identified.
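The behaviour of the superposition module can be illustrated with a short pure-Python sketch. It is an assumption of this illustration, not the patent's code, that images are 2D lists indexed image[y][x], that curves are lists of (x, y) coordinates, and that identified pixels are marked with a fixed gray value:

```python
def overlay_curves(image, curves, mark=255):
    # copy the image to be processed, then draw every edge curve
    # onto the copy to form the target image with curves identified
    target = [row[:] for row in image]
    for curve in curves:
        for x, y in curve:
            target[y][x] = mark
    return target
```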
In one embodiment, the pre-processing module 1702 includes: a filtering submodule, a binarization processing submodule, a segmentation submodule, and an acquisition submodule, wherein the filtering submodule is used for filtering the image to be processed; the binarization processing submodule is used for performing binarization processing on the filtered image to obtain a binary gray image; the segmentation submodule is used for performing connected domain segmentation on the binary gray image to segment out each connected domain image, wherein each connected domain image comprises at least one edge region; and the acquisition submodule is used for acquiring the characteristic image corresponding to each connected domain image according to each connected domain image and the filtered image.
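Two of these submodules can be sketched in pure Python. This is an illustrative reconstruction under the assumptions that images are 2D lists indexed img[y][x], that binarization uses a single global threshold, and that connected domains are 8-connected; none of these choices is fixed by the patent:

```python
from collections import deque

def binarize(img, threshold):
    # pixels below the threshold become 0, the rest become 1
    return [[1 if v >= threshold else 0 for v in row] for row in img]

def connected_domains(binary):
    # 8-connected component labelling via breadth-first search;
    # returns one 0/1 mask image per connected domain of 1-pixels
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    masks = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy][sx] and not seen[sy][sx]:
                mask = [[0] * w for _ in range(h)]
                q = deque([(sx, sy)])
                seen[sy][sx] = True
                while q:
                    x, y = q.popleft()
                    mask[y][x] = 1
                    for nx in range(max(0, x - 1), min(w, x + 2)):
                        for ny in range(max(0, y - 1), min(h, y + 2)):
                            if binary[ny][nx] and not seen[ny][nx]:
                                seen[ny][nx] = True
                                q.append((nx, ny))
                masks.append(mask)
    return masks
```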
In one embodiment, the acquisition submodule includes: a multiplying unit, configured to multiply the gray value of each pixel in each connected domain image by the gray value of the corresponding pixel in the filtered image to obtain the target gray value of each pixel of the connected domain image; and an acquisition unit, configured to acquire each characteristic image based on each connected domain image and the target gray values.
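Because the connected domain image is binary, the multiplication acts as a mask. A minimal pure-Python sketch (illustrative name, images as 2D lists of equal size):

```python
def feature_image(domain_mask, filtered):
    # element-wise product: since the mask holds only 0s and 1s, this
    # keeps the filtered gray value inside the connected domain and
    # zeroes every pixel outside it
    return [[m * g for m, g in zip(mrow, frow)]
            for mrow, frow in zip(domain_mask, filtered)]
```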
In one embodiment, the second obtaining module 1703 includes: a first determining submodule, configured to determine the position information of the (i+1)-th characteristic pixel according to the position information of the i-th characteristic pixel and the position information of the (i-1)-th characteristic pixel on the characteristic image, until the position information of the last characteristic pixel of the edge region is obtained, where i is greater than or equal to 2; and a second determining submodule, configured to determine the edge curve according to the position information from the first characteristic pixel to the last characteristic pixel.
In one embodiment, the second obtaining module 1703 further includes: a positioning submodule, configured to acquire the position information of the first characteristic pixel of the edge region, where when the edge region is connected with the boundary of the corresponding characteristic image, the first characteristic pixel is the pixel with the highest gray value on the image boundary, and when the edge region is not connected with the boundary of the characteristic image, the first characteristic pixel is the pixel with the highest gray value in the edge region; and a third determining submodule, configured to determine the position information of the second characteristic pixel according to the position information of the first characteristic pixel, where the second characteristic pixel is located on the vertical axis of the first characteristic pixel and is adjacent to the first characteristic pixel.
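The positioning rule for the first characteristic pixel can be sketched as follows. This is an illustrative reconstruction; it assumes the characteristic image is a 2D grayscale list indexed feature[y][x] and that whether the edge region touches the image boundary is already known:

```python
def first_feature_pixel(feature, touches_boundary):
    # feature: 2D grayscale list indexed feature[y][x]
    h, w = len(feature), len(feature[0])
    if touches_boundary:
        # edge region touches the image boundary: search only the
        # pixels lying on the boundary of the characteristic image
        coords = [(x, y) for y in range(h) for x in range(w)
                  if x in (0, w - 1) or y in (0, h - 1)]
    else:
        # otherwise search the whole characteristic image
        coords = [(x, y) for y in range(h) for x in range(w)]
    # the first characteristic pixel is the one with the highest gray value
    return max(coords, key=lambda p: feature[p[1]][p[0]])
```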
In one embodiment, the first determination submodule includes: the first determining unit is used for determining a candidate pixel of the (i + 1) th characteristic pixel according to the position information of the ith characteristic pixel and the position information of the (i-1) th characteristic pixel on the characteristic image; the second determination unit is configured to determine position information of a candidate pixel having a highest gray scale value among the candidate pixels as position information of the (i + 1) th feature pixel.
In one embodiment, the first determination unit includes: an extension subunit, configured to determine a candidate extension direction of the edge curve according to the position information of the i-th feature pixel and the position information of the (i-1)-th feature pixel on the feature image, where the direction from the (i-1)-th feature pixel to the i-th feature pixel is taken as the reference direction, and the included angle between the candidate extension direction and the reference direction is smaller than 90 degrees; and a determining subunit, configured to determine a pixel located in the candidate extension direction and adjacent to the i-th feature pixel as a candidate pixel of the (i+1)-th feature pixel.
In one embodiment, the image processing apparatus 1700 further includes: the device comprises a compression module and an expansion module, wherein the compression module is used for compressing an image to be processed according to a preset compression ratio; the expansion module is used for expanding the edge curve according to a preset compression ratio.
The respective modules in the image processing apparatus 1700 described above may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded, in hardware form, in or independent of a processor in the computer device, or stored, in software form, in a memory in the computer device, so that the processor can invoke and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 18. The computer device includes a processor, a memory, a communication interface, a display unit, and an input device connected through a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless communication can be realized through WIFI, a mobile cellular network, NFC (near field communication), or other technologies. The computer program is executed by the processor to implement an image processing method. The display unit of the computer device may be a liquid crystal display or an electronic ink display, and the input device of the computer device may be a touch layer covering the display, a key, a trackball, or a touchpad arranged on the housing of the computer device, or an external keyboard, touchpad, or mouse.
It will be appreciated by those skilled in the art that the configuration shown in fig. 18 is a block diagram of only a portion of the configuration relevant to the present application and does not limit the computer device to which the present application is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
acquiring an image to be processed, wherein the image to be processed is a scanning electron microscope image;
preprocessing an image to be processed to obtain a characteristic image corresponding to each edge region in the image to be processed;
performing edge tracking on each characteristic image to obtain an edge curve of each edge region;
and correspondingly superposing each edge curve on the image to be processed to form a target image with the identified edge curve.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
filtering the image to be processed; carrying out binarization processing on the filtered image to obtain a binary gray image; performing connected domain segmentation on the binary gray image to segment out each connected domain image, wherein each connected domain image comprises at least one edge region; and acquiring the characteristic image corresponding to each connected domain image according to each connected domain image and the filtered image.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
multiplying the gray value of the pixel in each connected domain image with the gray value of the corresponding pixel in the filtered image to obtain the target gray value of the pixel of each connected domain image; and acquiring each characteristic image based on each connected domain image and each target gray value.
In one embodiment, the processor when executing the computer program further performs the steps of:
determining the position information of the (i + 1) th characteristic pixel according to the position information of the ith characteristic pixel and the position information of the (i-1) th characteristic pixel on the characteristic image until the position information of the last characteristic pixel of the edge region is obtained, wherein i is more than or equal to 2; and determining an edge curve according to the position information from the first characteristic pixel to the last characteristic pixel.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring position information of a first feature pixel on an edge region, wherein the first feature pixel is the pixel with the highest gray value on the image boundary when the edge region is connected with the boundary of the corresponding feature image, and the first feature pixel is the pixel with the highest gray value in the edge region when the edge region is not connected with the boundary of the feature image; and determining the position information of a second feature pixel according to the position information of the first feature pixel, wherein the second feature pixel is positioned on the vertical axis of the first feature pixel and is adjacent to the first feature pixel.
In one embodiment, the processor when executing the computer program further performs the steps of:
determining a candidate pixel of the (i + 1) th feature pixel according to the position information of the ith feature pixel and the position information of the (i-1) th feature pixel on the feature image; and determining the position information of the candidate pixel with the highest gray scale value in all the candidate pixels as the position information of the (i + 1) th characteristic pixel.
In one embodiment, the processor when executing the computer program further performs the steps of:
determining a candidate extending direction of the edge curve according to the position information of the ith characteristic pixel and the position information of the (i-1) th characteristic pixel on the characteristic image, wherein the extending direction of the (i-1) th characteristic pixel to the ith characteristic pixel is taken as a reference direction, and an included angle between the candidate extending direction and the reference direction is less than 90 degrees; and determining a pixel which is positioned in the candidate extension direction and adjacent to the ith characteristic pixel as a candidate pixel of the (i + 1) th characteristic pixel.
In one embodiment, the processor, when executing the computer program, further performs the steps of: before preprocessing an image to be processed, compressing the image to be processed according to a preset compression ratio; and expanding the edge curves according to a preset compression ratio before correspondingly superposing the edge curves on the image to be processed.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring an image to be processed, wherein the image to be processed is a scanning electron microscope image;
preprocessing an image to be processed to obtain a characteristic image corresponding to each edge region in the image to be processed;
performing edge tracking on each characteristic image to obtain an edge curve of each edge region;
and correspondingly superposing each edge curve on the image to be processed to form a target image with the identified edge curve.
In one embodiment, the computer program when executed by the processor further performs the steps of:
filtering the image to be processed; carrying out binarization processing on the filtered image to obtain a binary gray image; performing connected domain segmentation on the binary gray image to segment out each connected domain image, wherein each connected domain image comprises at least one edge region; and acquiring the characteristic image corresponding to each connected domain image according to each connected domain image and the filtered image.
In one embodiment, the computer program when executed by the processor further performs the steps of:
multiplying the gray value of the pixel in each connected domain image with the gray value of the corresponding pixel in the filtered image to obtain the target gray value of the pixel of each connected domain image; and acquiring each characteristic image based on each connected domain image and each target gray value.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining the position information of the (i + 1) th characteristic pixel according to the position information of the ith characteristic pixel and the position information of the (i-1) th characteristic pixel on the characteristic image until the position information of the last characteristic pixel of the edge region is obtained, wherein i is more than or equal to 2; and determining an edge curve according to the position information from the first characteristic pixel to the last characteristic pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring position information of a first feature pixel on an edge region, wherein the first feature pixel is the pixel with the highest gray value on the image boundary when the edge region is connected with the boundary of the corresponding feature image, and the first feature pixel is the pixel with the highest gray value in the edge region when the edge region is not connected with the boundary of the feature image; and determining the position information of a second feature pixel according to the position information of the first feature pixel, wherein the second feature pixel is positioned on the vertical axis of the first feature pixel and is adjacent to the first feature pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining a candidate pixel of the (i + 1) th feature pixel according to the position information of the ith feature pixel and the position information of the (i-1) th feature pixel on the feature image; and determining the position information of the candidate pixel with the highest gray value in the candidate pixels as the position information of the (i + 1) th characteristic pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining a candidate extending direction of the edge curve according to the position information of the ith characteristic pixel and the position information of the (i-1) th characteristic pixel on the characteristic image, wherein the extending direction of the (i-1) th characteristic pixel to the ith characteristic pixel is taken as a reference direction, and an included angle between the candidate extending direction and the reference direction is smaller than 90 degrees; and determining a pixel which is positioned in the candidate extension direction and adjacent to the ith characteristic pixel as a candidate pixel of the (i + 1) th characteristic pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of: before preprocessing an image to be processed, compressing the image to be processed according to a preset compression ratio; and expanding the edge curves according to a preset compression ratio before correspondingly superposing the edge curves on the image to be processed.
In one embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, performs the steps of:
acquiring an image to be processed, wherein the image to be processed is a scanning electron microscope image;
preprocessing an image to be processed to obtain a characteristic image corresponding to each edge region in the image to be processed;
performing edge tracking on each characteristic image to obtain an edge curve of each edge region;
and correspondingly superposing each edge curve on the image to be processed to form a target image with the identified edge curve.
In one embodiment, the computer program when executed by the processor further performs the steps of:
filtering the image to be processed; carrying out binarization processing on the filtered image to obtain a binary gray image; performing connected domain segmentation on the binary gray image to segment out each connected domain image, wherein each connected domain image comprises at least one edge region; and acquiring the characteristic image corresponding to each connected domain image according to each connected domain image and the filtered image.
In one embodiment, the computer program when executed by the processor further performs the steps of:
multiplying the gray value of the pixel in each connected domain image with the gray value of the corresponding pixel in the filtered image to obtain the target gray value of the pixel of each connected domain image; and acquiring each characteristic image based on each connected domain image and each target gray value.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining the position information of the (i + 1) th characteristic pixel according to the position information of the ith characteristic pixel and the position information of the (i-1) th characteristic pixel on the characteristic image until the position information of the last characteristic pixel of the edge area is obtained, wherein i is more than or equal to 2; and determining an edge curve according to the position information from the first characteristic pixel to the last characteristic pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring position information of a first feature pixel on an edge region, wherein the first feature pixel is the pixel with the highest gray value on the image boundary when the edge region is connected with the boundary of the corresponding feature image, and the first feature pixel is the pixel with the highest gray value in the edge region when the edge region is not connected with the boundary of the feature image; and determining the position information of a second feature pixel according to the position information of the first feature pixel, wherein the second feature pixel is positioned on the vertical axis of the first feature pixel and is adjacent to the first feature pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining a candidate pixel of the (i + 1) th feature pixel according to the position information of the ith feature pixel and the position information of the (i-1) th feature pixel on the feature image; and determining the position information of the candidate pixel with the highest gray value in the candidate pixels as the position information of the (i + 1) th characteristic pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining a candidate extending direction of the edge curve according to the position information of the ith characteristic pixel and the position information of the (i-1) th characteristic pixel on the characteristic image, wherein the extending direction of the (i-1) th characteristic pixel to the ith characteristic pixel is taken as a reference direction, and an included angle between the candidate extending direction and the reference direction is less than 90 degrees; and determining a pixel which is positioned in the candidate extension direction and adjacent to the ith characteristic pixel as a candidate pixel of the (i + 1) th characteristic pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of: before preprocessing an image to be processed, compressing the image to be processed according to a preset compression ratio; and expanding the edge curves according to a preset compression ratio before correspondingly superposing the edge curves on the image to be processed.
It should be noted that the user information (including but not limited to user device information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, displayed data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above may be implemented by a computer program instructing the relevant hardware; the computer program may be stored in a non-volatile computer-readable storage medium, and when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory may include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the various embodiments provided herein may be, without limitation, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, or the like.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not therefore be construed as limiting the scope of the present application. It should be noted that, for a person of ordinary skill in the art, several variations and improvements can be made without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (11)

1. An image processing method, characterized in that the method comprises:
acquiring an image to be processed, wherein the image to be processed is a scanning electron microscope image;
preprocessing the image to be processed to obtain a characteristic image corresponding to each marginal area in the image to be processed;
acquiring position information of a first feature pixel on the edge area, wherein when the edge area is connected with a boundary corresponding to the feature image, the first feature pixel is a pixel with the highest gray value on the boundary of the feature image, and when the edge area is not connected with the boundary of the feature image, the first feature pixel is a pixel with the highest gray value on the edge area;
determining position information of a second feature pixel on the edge area according to the position information of the first feature pixel on the edge area, wherein the second feature pixel is located on a vertical axis of the first feature pixel and is adjacent to the first feature pixel;
determining a candidate extending direction of an edge curve of the edge area according to the position information of the ith characteristic pixel and the position information of the (i-1) th characteristic pixel on the edge area, wherein the extending direction of the (i-1) th characteristic pixel to the ith characteristic pixel is taken as a reference direction, an included angle between the candidate extending direction and the reference direction is less than 90 degrees, and i is greater than or equal to 2;
determining a pixel which is positioned in the candidate extension direction and adjacent to the ith characteristic pixel as a candidate pixel of the (i + 1) th characteristic pixel;
determining the position information of the candidate pixel with the highest gray value in all the candidate pixels as the position information of the (i + 1) th characteristic pixel until the position information of the last characteristic pixel of the edge area is obtained;
determining an edge curve of each edge region according to the position information from the first characteristic pixel to the last characteristic pixel of each edge region;
and correspondingly superposing each edge curve on the image to be processed to form a target image with the identified edge curve.
2. The image processing method according to claim 1, wherein preprocessing the image to be processed comprises:
filtering the image to be processed;
carrying out binarization processing on the filtered image to obtain a binary gray image;
performing connected domain segmentation on the binary gray level image to segment each connected domain image, wherein the connected domain image comprises at least one edge region;
and acquiring the characteristic image corresponding to each connected domain image according to each connected domain image and the filtered image.
3. The image processing method according to claim 2, wherein obtaining the feature image corresponding to each connected component image according to each connected component image and the filtered image comprises:
multiplying the gray value of the pixel in each connected domain image with the gray value of the corresponding pixel in the filtered image to obtain the target gray value of each connected domain image pixel;
and acquiring each characteristic image based on each connected domain image and each target gray value.
4. The image processing method according to claim 2, wherein binarizing the filtered image to obtain a binary grayscale image, comprises: acquiring the gray value of each pixel of the filtered image; and setting the gray value of the pixel with the gray value smaller than the preset threshold value as zero, and setting the gray value of the pixel with the gray value larger than the preset threshold value as one.
5. The image processing method according to claim 1, wherein before preprocessing the image to be processed, further comprising:
compressing the image to be processed according to a preset compression ratio;
before each edge curve is correspondingly superimposed on the image to be processed, the method further comprises the following steps:
and expanding the edge curve according to the preset compression ratio.
6. An image processing apparatus, characterized in that the apparatus comprises:
the device comprises a first acquisition module, a second acquisition module and a processing module, wherein the first acquisition module is used for acquiring an image to be processed, and the image to be processed is a scanning electron microscope image;
the preprocessing module is used for preprocessing the image to be processed to obtain a feature image corresponding to each edge region of the image to be processed;
a second obtaining module, configured to obtain position information of a first feature pixel on the edge region, where the first feature pixel is the pixel with the highest gray value on a boundary of the feature image when the edge region is connected to the corresponding boundary of the feature image, and is the pixel with the highest gray value on the edge region when the edge region is not connected to the boundary of the feature image; determine position information of a second feature pixel on the edge region according to the position information of the first feature pixel, where the second feature pixel is located on the vertical axis of the first feature pixel and is adjacent to the first feature pixel; determine a candidate extension direction of an edge curve of the edge region according to the position information of the i-th feature pixel and of the (i-1)-th feature pixel on the edge region, where the direction from the (i-1)-th feature pixel to the i-th feature pixel is taken as the reference direction, the angle between the candidate extension direction and the reference direction is less than 90 degrees, and i is greater than or equal to 2; determine each pixel that lies in a candidate extension direction and is adjacent to the i-th feature pixel as a candidate pixel for the (i+1)-th feature pixel; determine the position information of the candidate pixel with the highest gray value among the candidate pixels as the position information of the (i+1)-th feature pixel, until the position information of the last feature pixel of the edge region is obtained; and determine an edge curve of each edge region according to the position information from the first feature pixel to the last feature pixel of that edge region;
and the superposition module is used for superposing each edge curve on the image to be processed at its corresponding position to form a target image with the identified edge curves.
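The greedy curve-tracing loop described by the second obtaining module above can be sketched as follows. This is a minimal illustration, not the patented implementation: the feature image is assumed to be a 2-D grayscale NumPy array, candidates are drawn from the 8-neighbourhood, the "angle less than 90 degrees" test is implemented as a positive dot product with the reference direction, and already-visited pixels are skipped (a termination detail the claim leaves implicit).

```python
import numpy as np

def trace_edge_curve(feat, first, second):
    """Follow the brightest adjacent pixel whose direction makes an angle of
    less than 90 degrees with the previous step (the claimed procedure)."""
    h, w = feat.shape
    curve = [first, second]
    while True:
        prev, cur = curve[-2], curve[-1]
        ref = (cur[0] - prev[0], cur[1] - prev[1])  # (i-1)-th -> i-th direction
        candidates = []
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                # angle < 90 degrees  <=>  positive dot product with ref
                if dr * ref[0] + dc * ref[1] <= 0:
                    continue
                r, c = cur[0] + dr, cur[1] + dc
                if 0 <= r < h and 0 <= c < w and (r, c) not in curve:
                    candidates.append((r, c))
        if not candidates:
            return curve  # last feature pixel of the edge region reached
        # candidate pixel with the highest gray value becomes the (i+1)-th pixel
        curve.append(max(candidates, key=lambda p: feat[p]))
```

Selecting the first feature pixel (highest gray value on the feature-image boundary, or on the whole region when the region does not touch the boundary) is omitted here for brevity.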
7. The image processing apparatus according to claim 6, wherein the preprocessing module comprises:
the filtering submodule is used for filtering the image to be processed;
a binarization submodule for binarizing the filtered image to obtain a binary grayscale image;
the segmentation submodule is used for performing connected domain segmentation on the binary grayscale image to obtain each connected domain image, where each connected domain image comprises at least one edge region;
and the obtaining submodule is used for obtaining the feature image corresponding to each connected domain image according to the connected domain images and the filtered image.
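The filter-binarize-segment pipeline in claim 7 can be sketched as below. The claim does not name a filter type or connectivity, so a 3x3 mean filter and 4-connected flood-fill labelling are assumed purely for illustration.

```python
import numpy as np

def mean_filter3(img):
    """3x3 mean filter (the claim leaves the filter type unspecified)."""
    h, w = img.shape
    pad = np.pad(img.astype(float), 1, mode="edge")
    return sum(pad[r:r + h, c:c + w] for r in range(3) for c in range(3)) / 9.0

def label_domains(binary):
    """Connected domain segmentation: label 4-connected domains of a binary
    image with an iterative flood fill; returns (label map, domain count)."""
    labels = np.zeros(binary.shape, dtype=int)
    nlab = 0
    for seed in zip(*np.nonzero(binary)):
        if labels[seed]:
            continue  # pixel already belongs to a labelled domain
        nlab += 1
        stack = [seed]
        while stack:
            r, c = stack.pop()
            if (0 <= r < binary.shape[0] and 0 <= c < binary.shape[1]
                    and binary[r, c] and not labels[r, c]):
                labels[r, c] = nlab
                stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return labels, nlab
```

A thresholded `mean_filter3` output (e.g. `filt >= 128`) would feed `label_domains`, and each label then selects one connected domain image.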
8. The image processing apparatus according to claim 7, wherein the obtaining submodule comprises:
a multiplication unit, configured to multiply the gray value of each pixel in each connected domain image by the gray value of the corresponding pixel in the filtered image to obtain a target gray value for each pixel of the connected domain image;
and an obtaining unit, configured to obtain each feature image based on each connected domain image and the corresponding target gray values.
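The pixel-wise multiplication of claim 8 reduces to masking the filtered image with each binary domain: where the domain mask is 1 the filtered gray value survives, elsewhere the product is 0. A one-line sketch, assuming the label map from a prior segmentation step:

```python
import numpy as np

def feature_images(labels, nlab, filtered):
    """One feature image per connected domain: the domain's binary mask
    (gray value 0 or 1) times the corresponding filtered-image pixel."""
    return [(labels == k).astype(float) * filtered for k in range(1, nlab + 1)]
```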
9. The image processing apparatus according to claim 6, characterized by further comprising:
the compression module is used for compressing the image to be processed according to a preset compression ratio;
and the expansion module is used for expanding the edge curve according to the preset compression ratio.
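Claim 9's compress-then-expand pair keeps the curve tracing cheap at low resolution and maps the result back. A minimal sketch, assuming an integer compression ratio and nearest-neighbour subsampling (the claim fixes neither):

```python
import numpy as np

def compress(img, ratio):
    """Downsample the image to be processed by an integer ratio
    (nearest-neighbour subsampling assumed)."""
    return img[::ratio, ::ratio]

def expand_curve(curve, ratio):
    """Map (row, col) curve points found on the compressed image back to
    the original resolution using the same preset ratio."""
    return [(r * ratio, c * ratio) for r, c in curve]
```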
10. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 5.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 5.
CN202210894948.0A 2022-07-28 2022-07-28 Image processing method, image processing device, computer equipment and storage medium Active CN115063408B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210894948.0A CN115063408B (en) 2022-07-28 2022-07-28 Image processing method, image processing device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115063408A CN115063408A (en) 2022-09-16
CN115063408B true CN115063408B (en) 2022-12-02

Family

ID=83205356

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210894948.0A Active CN115063408B (en) 2022-07-28 2022-07-28 Image processing method, image processing device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115063408B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102193302A (en) * 2010-03-03 2011-09-21 中芯国际集成电路制造(上海)有限公司 Mask image defection detection method and detection system thereof
CN103543394A (en) * 2013-10-27 2014-01-29 华北电力大学(保定) Discharge ultraviolet imaging quantization parameter extraction method of high-voltage electric equipment
CN104317159A (en) * 2010-03-03 2015-01-28 中芯国际集成电路制造(上海)有限公司 Mask graphic defect detection method and mask graphic defect detection system for
CN110619318A (en) * 2019-09-27 2019-12-27 腾讯科技(深圳)有限公司 Image processing method, microscope, system and medium based on artificial intelligence
CN113808109A (en) * 2021-09-17 2021-12-17 贵州电网有限责任公司 Method for detecting zero value insulator discharge distribution by using GaN red ultraviolet sensor

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006026843A1 (en) * 2006-06-09 2007-12-20 Carl Zeiss Nts Gmbh Method for processing a digital gray value image
US10484601B2 (en) * 2015-08-31 2019-11-19 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
CN107507208B (en) * 2017-07-12 2020-04-10 天津大学 Image feature point extraction method based on curvature estimation on contour
CN113298837B (en) * 2021-07-27 2021-11-26 南昌工程学院 Image edge extraction method and device, storage medium and equipment

Also Published As

Publication number Publication date
CN115063408A (en) 2022-09-16

Similar Documents

Publication Publication Date Title
JP6792842B2 (en) Visual inspection equipment, conversion data generation equipment, and programs
EP1347410A2 (en) Image processing method and apparatus
WO2013091088A1 (en) Image registration method and system robust to noise
US20160292838A1 (en) Image synthesis apparatus, image synthesis method, and recording medium
CN111539238B (en) Two-dimensional code image restoration method and device, computer equipment and storage medium
CN117011304B (en) Defect detection method, defect detection device, computer equipment and computer readable storage medium
CN112560864B (en) Image semantic segmentation method and device and training method of image semantic segmentation model
Sun et al. A general non-local denoising model using multi-kernel-induced measures
Mishra et al. Development of robust neighbor embedding based super-resolution scheme
CN115063408B (en) Image processing method, image processing device, computer equipment and storage medium
Climer et al. Local lines: A linear time line detector
Karaca et al. Interpolation-based image inpainting in color images using high dimensional model representation
CN115272428A (en) Image alignment method and device, computer equipment and storage medium
WO2022188102A1 (en) Depth image inpainting method and apparatus, camera assembly, and electronic device
CN115797194A (en) Image denoising method, image denoising device, electronic device, storage medium, and program product
Hung et al. Learning-based image interpolation via robust k-NN searching for coherent AR parameters estimation
CN115965856B (en) Image detection model construction method, device, computer equipment and storage medium
CN116563357B (en) Image matching method, device, computer equipment and computer readable storage medium
CN115600619B (en) Dot matrix two-dimensional code identification method and device, electronic equipment and storage medium
CN116659520B (en) Matching positioning method, device and equipment based on bionic polarization vision enhancement
CN111882588B (en) Image block registration method and related product
CN113838138B (en) System calibration method, system, device and medium for optimizing feature extraction
CN116952954B (en) Concave-convex detection method, device, equipment and storage medium based on stripe light
CN113850307A (en) Image modification area positioning method and system based on deep learning and storage medium
CN118229585A (en) Depth image restoration method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 510700 No. 28, Fenghuang fifth road, Huangpu District, Guangzhou, Guangdong

Patentee after: Yuexin Semiconductor Technology Co.,Ltd.

Address before: 510700 No. 28, Fenghuang fifth road, Huangpu District, Guangzhou, Guangdong

Patentee before: Guangzhou Yuexin Semiconductor Technology Co.,Ltd.