WO2020051746A1 - Image edge detection method, image processing device and computer storage medium - Google Patents
- Publication number
- WO2020051746A1 (PCT/CN2018/104892)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- edge point
- detection area
- edge
- neighborhood
- point
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- the present application relates to the field of image detection, and in particular, to an image edge detection method, an image processing device, and a computer storage medium.
- Edge detection is a basic operation in image processing and can be applied in different fields. For example, in the industrial field, edge detection is used to inspect the surface quality of a workpiece: an image of the workpiece surface is obtained, and edge detection is then performed on that image to detect whether there are scratches on the surface. In practical applications, the surface of the workpiece is often only lightly scratched, so the corresponding edges in the image have weak contrast and are not easy to detect.
- the present application provides an image edge detection method, an image processing device, and a computer storage medium to solve the problem that it is difficult to detect weak edges in an image in the prior art.
- the present application provides an image edge detection method.
- The method includes: obtaining a neighborhood difference degree of each pixel in a first detection area of the image, and extracting first edge points based on the neighborhood difference degree; determining a second detection area according to the first edge points, wherein the second detection area is smaller than the first detection area; performing magnification processing on the neighborhood difference degree of each pixel in the second detection area, and extracting second edge points based on the magnified neighborhood difference degree; and determining the edge of the image according to the first edge points and the second edge points.
- the present application provides an image processing device.
- the device includes a processor and a memory.
- the memory stores a computer program, and the processor is configured to execute the computer program to implement the foregoing method.
- the present application provides a computer storage medium.
- the computer storage medium is used to store a computer program, and the computer program can be executed to implement the foregoing method.
- In the above scheme, the extraction of edge points is divided into two stages. First, for the larger first detection area, the first edge points are extracted according to the neighborhood difference degree. Because the neighborhood difference degree of the edge points not yet extracted is weak, a smaller second detection area is determined to further extract edge points within a local range: the neighborhood difference degree of each pixel in the second detection area is enlarged, and the second edge points are extracted based on the enlarged neighborhood difference degree. The edge of the image is then determined according to the first edge points and the second edge points, so that the process implements detection of weak edges.
- FIG. 1 is a schematic flowchart of an embodiment of an image edge detection method according to the present application;
- FIG. 2 is a schematic flowchart of another embodiment of an edge detection method of an image of the present application.
- FIG. 3 is a schematic diagram of obtaining a first edge point block on an image in an embodiment shown in FIG. 2;
- FIG. 4 is a schematic diagram of obtaining a first edge point block on an image in another embodiment in the embodiment shown in FIG. 2;
- FIG. 5 is a schematic flowchart of extracting a second edge point in the embodiment shown in FIG. 2;
- FIG. 6 is a schematic diagram of a weighting window in the embodiment shown in FIG. 2;
- FIG. 7 is a schematic diagram of extracting a second edge point on an image in the embodiment shown in FIG. 2;
- FIG. 8 is a schematic flowchart of extracting a third edge point in the embodiment shown in FIG. 2;
- FIG. 9 is a schematic diagram of extracting a third edge point on an image in the embodiment shown in FIG. 2;
- FIG. 10 is a schematic flowchart of an embodiment of a method for detecting a surface of a workpiece of the present application;
- FIG. 11 is a schematic structural diagram of an embodiment of an image processing apparatus according to the present application.
- FIG. 12 is a schematic structural diagram of an embodiment of a computer storage medium of the present application.
- This application provides a method for edge detection of an image. Edge detection recognizes points with obvious changes in the image. The method of this application identifies edge points based on the neighborhood difference degree of the pixels and mainly includes two extraction steps: first, edge points are extracted based on the neighborhood difference degree; then the neighborhood difference degree is enlarged, and edge points are extracted based on the enlarged neighborhood difference degree, so that weak edges in the image can be detected.
- FIG. 1 is a schematic flowchart of an embodiment of an edge detection method of an image of the present application.
- the edge detection method of an image of this embodiment includes the following steps.
- S11 Obtain the neighborhood difference degree of each pixel point in the first detection area of the image, and extract the first edge point based on the neighborhood difference degree.
- This embodiment detects a detection area on the image. The detection area may be the entire image or a certain region within the image; to distinguish between detection areas, the detection area in this step is referred to as the first detection area.
- the degree of neighborhood difference of each pixel in the first detection area is obtained.
- The neighborhood difference degree indicates the difference between a pixel and its neighboring pixels, and may be a difference between pixel values, a first-order gradient value, or a second-order gradient value.
- the edge points are extracted based on the degree of neighborhood difference, and pixels with significant differences from neighboring pixels can be extracted as edge points.
- the above process is the first extraction of edge points in this embodiment.
- the extracted edge point is called a first edge point.
- The first edge points are extracted according to the neighborhood difference degree. Edge points with obvious differences can be extracted accurately by the process of step S11, but for edge points whose differences are not obvious, it is generally difficult to extract them directly based on the difference degree. Therefore, in this embodiment, the following steps further extract edge points.
- S12 Determine the second detection area according to the first edge points. From the first edge points already determined, the approximate position in the image of the edge line corresponding to those points can be known, so the second detection only needs to cover the area where the edge line may lie. Therefore, in step S12, the second detection area, that is, the area where the edge line may be located, is determined according to the first edge points, wherein the second detection area is smaller than the first detection area and lies within it. Narrowing the area for the second detection makes the search for edge points more targeted and improves detection efficiency.
- For example, the entire image is used as the first detection area, and the smallest area including all the first edge points is used as the second detection area; the boundary of the second detection area follows that of the first detection area, that is, it has the same shape as the first detection area, for example the dashed box shown in FIG. 3.
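As a rough sketch of this step (assuming the first edge points are given as (row, col) coordinates and the first detection area is rectangular), the smallest enclosing area can be computed as a bounding box:

```python
# Sketch: the second detection area as the smallest axis-aligned
# rectangle containing all first edge points (bounds inclusive).

def second_detection_area(first_edge_points):
    """Return (top, left, bottom, right) of the smallest box
    enclosing every first edge point."""
    rows = [p[0] for p in first_edge_points]
    cols = [p[1] for p in first_edge_points]
    return (min(rows), min(cols), max(rows), max(cols))

# Example with three hypothetical first edge points:
box = second_detection_area([(4, 2), (9, 7), (6, 3)])
# box == (4, 2, 9, 7)
```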
- S13 Enlarge the neighborhood difference degree of each pixel point in the second detection area, and extract a second edge point based on the neighborhood difference degree after the enlargement process.
- In step S13, the edge points not found in step S11 are further extracted. The neighborhood difference degree of each pixel in the second detection area is enlarged; the enlargement processing magnifies the difference between a pixel and its neighboring pixels, so that edge points that are not obvious are highlighted. The second edge points are then extracted based on the enlarged neighborhood difference degree, which makes the extraction of the second edge points more accurate.
- S14 Determine the edge of the image according to the first edge point and the second edge point.
- The edge of the image can be determined from the edge points extracted in the two passes, for example by performing a fitting operation on all the edge points to determine the edge line in the image.
- In this embodiment, the first extraction of edge points is performed according to the neighborhood difference degree; the neighborhood difference degree is then enlarged, and the second extraction is performed according to the enlarged neighborhood difference degree so as to find edge points with weaker difference degrees. The edge of the image is determined based on the edge points extracted in the two passes, so that weak edges in the image are detected.
- FIG. 2 is a schematic flowchart of another embodiment of an edge detection method of an image of the present application.
- the edge detection method of an image of this embodiment includes the following steps.
- S21 Obtain the neighborhood difference degree of each pixel point in the first detection area of the image, and extract the first edge point based on the neighborhood difference degree.
- The neighborhood difference degree may be a difference between pixel values, a first-order gradient value, or a second-order gradient value. A threshold is set for the difference or gradient value, and the values are filtered against the threshold to extract the first edge points. This process can be implemented using existing image edge algorithms, such as the Canny algorithm or the Sobel algorithm.
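The patent does not fix a particular operator, so the following is only a minimal sketch: the neighborhood difference degree is taken as the largest absolute pixel-value difference to the 4-connected neighbors and compared against a threshold (in practice a Canny or Sobel implementation would typically be used instead):

```python
import numpy as np

def first_edge_points(image, area, threshold):
    """Minimal sketch of step S11: the neighborhood difference degree is
    the maximum absolute pixel-value difference to the 4-connected
    neighbors; pixels whose degree exceeds `threshold` become first
    edge points. `area` is (top, left, bottom, right), inclusive."""
    top, left, bottom, right = area
    points = []
    for r in range(max(top, 1), min(bottom, image.shape[0] - 2) + 1):
        for c in range(max(left, 1), min(right, image.shape[1] - 2) + 1):
            center = int(image[r, c])
            diff = max(abs(center - int(image[r + dr, c + dc]))
                       for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)))
            if diff > threshold:
                points.append((r, c))
    return points

img = np.zeros((5, 5), dtype=np.uint8)
img[:, 3:] = 200                 # vertical step edge between columns 2 and 3
pts = first_edge_points(img, (0, 0, 4, 4), threshold=100)
# pts contains the pixels on both sides of the step edge
```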
- S22 Classify the first edge points according to the neighborhood difference degree and/or the positional relationship of the first edge points to obtain at least two first edge point blocks.
- In step S22, the first edge points are classified, and each resulting first edge point block includes at least one first edge point. Edge points that cannot be classified can be discarded, which removes interference points from the first edge points extracted in step S21. The classification also allows the second detection area to be determined according to the first edge point blocks in the following step S23, making the determination of the second detection area more targeted and efficient.
- The first edge points are classified according to the neighborhood difference degree and/or the positional relationship of the first edge points, which specifically includes the following classification methods.
- In the first classification method, the neighborhood difference degree of each first edge point is compared with a plurality of preset difference degree threshold segments; first edge points whose neighborhood difference degrees fall within the same threshold segment are grouped as one class to form a classification block, and the classification blocks are used as the first edge point blocks.
- In this method, the first edge points are classified according to a plurality of preset difference degree threshold segments to obtain the first edge point blocks. The classification may also be implemented in step S21: while the first edge points are extracted according to the neighborhood difference degree of the pixels, multiple difference threshold segments are set and the first edge points are filtered against them, and the first edge points whose neighborhood difference degrees fall within the same threshold segment are grouped as one class to form a first edge point block.
- FIG. 3 is a schematic diagram of obtaining first edge point blocks on an image in one manner of the embodiment shown in FIG. 2. The first edge points are classified according to a plurality of difference degree threshold segments, and four first edge point blocks are obtained: (a1, a2, a3), (b1, b2), (c1, c2), (d1, d2, d3, d4).
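The first classification method can be sketched as follows (the segment bounds, point names, and degree values are made-up illustration values, not taken from the patent):

```python
# Sketch of the first classification method: each first edge point carries
# its neighborhood difference degree, and points whose degree falls in the
# same preset threshold segment form one first edge point block.

def classify_by_segments(points_with_degree, segments):
    """points_with_degree: list of (point, degree);
    segments: list of (low, high) difference threshold segments.
    Returns one block (list of points) per non-empty segment; points
    falling in no segment are discarded as interference."""
    blocks = [[] for _ in segments]
    for point, degree in points_with_degree:
        for i, (low, high) in enumerate(segments):
            if low <= degree < high:
                blocks[i].append(point)
                break
    return [b for b in blocks if b]

blocks = classify_by_segments(
    [("a1", 30), ("a2", 35), ("b1", 55), ("b2", 52), ("noise", 5)],
    segments=[(25, 45), (45, 65)])
# blocks == [["a1", "a2"], ["b1", "b2"]]; the degree-5 point is discarded
```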
- In the second classification method, the neighborhood difference degree of each first edge point is compared with a plurality of preset difference degree threshold segments, and first edge points whose neighborhood difference degrees fall within the same threshold segment are grouped as one class to form a primary classification block. Primary classification blocks whose corresponding difference threshold segments differ by less than a preset inter-segment difference threshold, and whose shortest distance to an adjacent classification block is less than a preset distance threshold, are further merged into a secondary classification block, and the secondary classification blocks are used as the first edge point blocks.
- The shortest distance represents the shortest distance between classification blocks.
- For example, FIG. 3 shows four primary classification blocks: (a1, a2, a3), (b1, b2), (c1, c2), (d1, d2, d3, d4).
- The primary classification blocks are further merged into secondary classification blocks. The criterion for merging multiple primary classification blocks into one class is that the difference between the difference degree threshold segments corresponding to the primary blocks is less than the preset inter-segment difference threshold, and the shortest distance between each primary block and its adjacent primary blocks is less than the preset distance threshold. If a primary block has multiple adjacent blocks, there is a shortest distance to each of them, and the smallest of these shortest distances is used for the judgment.
- The inter-segment difference is the difference between the maximum values, the minimum values, or the center values of the two difference degree threshold segments.
- The shortest distance to an adjacent classification block is the distance between the nearest pixels in the two adjacent blocks. For example, the shortest distance between the classification block (a1, a2, a3) and the adjacent classification block (b1, b2) is the distance between pixel a3 and pixel b1, and the shortest distance between the classification block (c1, c2) and its adjacent primary classification block (b1, b2) is the distance between pixels c1 and b2.
- Primary classification blocks whose inter-segment difference is less than the preset inter-segment difference threshold, and whose shortest distance to an adjacent classification block is less than the distance threshold, can be further merged into a secondary classification block.
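One possible reading of the secondary merge, sketched under the assumption that the primary blocks are listed in order along the edge and that each block carries the center value of its difference threshold segment:

```python
import math

def shortest_distance(block_a, block_b):
    """Distance between the nearest pixels of two blocks."""
    return min(math.dist(p, q) for p in block_a for q in block_b)

def merge_primary_blocks(blocks, centers, seg_diff_max, dist_max):
    """blocks: list of pixel-coordinate lists; centers: segment center
    value per block. Greedily merges each block into the previous
    secondary block when both criteria hold."""
    merged = [list(blocks[0])]
    merged_centers = [centers[0]]
    for block, center in zip(blocks[1:], centers[1:]):
        if (abs(center - merged_centers[-1]) < seg_diff_max
                and shortest_distance(merged[-1], block) < dist_max):
            merged[-1].extend(block)
        else:
            merged.append(list(block))
            merged_centers.append(center)
    return merged

primary = [[(0, 0), (0, 1)], [(0, 3)], [(0, 9)]]
secondary = merge_primary_blocks(primary, centers=[40, 45, 48],
                                 seg_diff_max=10, dist_max=4)
# the first two blocks merge (center diff 5 < 10, distance 2 < 4);
# the third stays separate (distance from (0,3) to (0,9) is 6 > 4)
```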
- Optionally, the distances of the first edge points within a classification block can also be judged; for example, a first edge point whose distance to its adjacent first edge point is greater than a threshold is discarded.
- FIG. 4 is a schematic diagram of obtaining a first edge point block on an image in another manner in the embodiment shown in FIG. 2.
- The primary classification blocks (a1, a2, a3), (b1, b2), (c1, c2), (d1, d2, d3, d4) are further merged into the secondary classification blocks (a1, a2, a3, b1, b2, c1, c2) and (d1, d2). Because the shortest distance between the primary classification block (c1, c2) and the primary classification block (d1, d2), that is, the distance between c2 and d1, is greater than the threshold, the primary classification block (d1, d2) is classified as a secondary classification block on its own.
- After the first edge points are classified in step S22 and the first edge point blocks are obtained, the second detection area is determined according to the first edge point blocks in the subsequent step.
- A second detection area may be determined between two adjacent first edge points within a first edge point block. Specifically, the distances between the first edge points in the block are calculated first, and the area between two adjacent first edge points whose distance exceeds a set range is determined as a second detection area. That is, two first edge points that are relatively far apart are used as the endpoints of the second detection area, and the second detection area is determined according to the endpoints and the shape of the first detection area; the shape of the second detection area is the same as the shape of the first detection area.
- A second detection area may also be determined between two adjacent first edge points of adjacent first edge point blocks: for two adjacent blocks, the area between their two nearest first edge points is determined as a second detection area. For example, the two adjacent first edge points are used as the endpoints of the second detection area, and the second detection area is determined according to the endpoints and the shape of the first detection area; the shape of the second detection area is the same as the shape of the first detection area.
- If the second detection area were determined directly from the first edge points, the distances between all the first edge points would have to be judged. By first determining the first edge point blocks, the second detection area can be determined with far fewer edge point distance calculations, so this step S23 is more efficient.
- For example, when the first edge point blocks are (a1, a2, a3, b1, b2, c1, c2) and (d1, d2), step S23 determines the following second detection areas: between two adjacent first edge points within a block whose distance exceeds the set range, the area between pixels a1 and a2, the area between pixels b1 and b2, and the area between pixels c1 and c2; and between two adjacent first edge points of adjacent blocks, the area between pixels c2 and d1.
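A sketch of this determination, assuming the first edge points within each block are ordered along the edge; the endpoint pairs returned stand in for the second detection areas:

```python
import math

def gap_areas(blocks, max_gap):
    """Return endpoint pairs bounding each second detection area:
    gaps inside a block wider than `max_gap`, plus the gap between
    the nearest points of adjacent blocks."""
    areas = []
    for block in blocks:
        for p, q in zip(block, block[1:]):      # gaps inside a block
            if math.dist(p, q) > max_gap:
                areas.append((p, q))
    for a, b in zip(blocks, blocks[1:]):        # gap between adjacent blocks
        areas.append((a[-1], b[0]))
    return areas

blocks = [[(0, 0), (0, 5), (0, 6)], [(0, 12), (0, 13)]]
areas = gap_areas(blocks, max_gap=3)
# areas == [((0, 0), (0, 5)), ((0, 6), (0, 12))]
```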
- the edge points in the second detection area are extracted.
- S24 Enlarge the neighborhood difference degree of each pixel point in the second detection area, and extract a second edge point based on the neighborhood difference degree after the enlargement process.
- In step S24, the edge points not detected in step S21 are further extracted. The neighborhood difference degree of each pixel is enlarged to further highlight the difference between the pixel and its neighboring pixels, and the second edge points are extracted based on the enlarged neighborhood difference degree.
- FIG. 5 is a schematic flowchart of extracting a second edge point in the embodiment shown in FIG. 2.
- the extraction process of the second edge point includes the following steps.
- S241 Set a weighting window on the second detection area. Setting the weighting window means first dividing out a detection window for detecting the second detection area, that is, each analysis detects only the pixels covered by the weighting window at its current position. The weighting window limits the size and range of the analyzed data: if the window is large, the data analyzed in each pass is comprehensive but the pass takes longer; if the window is small, each pass is short but the data is less comprehensive.
- The weighting window may be a rectangular window, a circular window, a fan-shaped window, etc. In this embodiment, the pixels in the second detection area need to be analyzed, so a rectangular window is generally used, for example a rectangular window covering (2n + 1) × (2n + 1) pixels, where n is an integer greater than or equal to 1. Such a window has a center point, and the pixels in its coverage area have a center pixel.
- S242 Weight the pixels covered by the weighting window at its current position, where the weight corresponding to the center pixel among the covered pixels is greater than the weights corresponding to the other pixels, so that the weighting in step S242 enlarges the difference between the center pixel and the other pixels. The weights of pixels farther from the center pixel can be set smaller, because an edge line in an image generally corresponds not to a single column of pixels but to several, that is, the farther a pixel is from an edge point, the larger its difference from the edge point already is. When detecting an edge point, it is therefore undesirable to over-enlarge the difference between the edge point and the pixels immediately next to it; doing so would be inconsistent with the actual situation and affect the judgment. For this reason, the weights in the weighting window corresponding to pixels farther from the center pixel are set smaller.
- FIG. 6 is a schematic diagram of the weighting window in the embodiment shown in FIG. 2. The weighting window is a 5 × 5 rectangular window. The weight of the center pixel in the window is 5, the weight of the pixels farthest from the center pixel is 1, and the weights of pixels closer to the center pixel are 2 or 3. Weighting the pixels covered by the window means multiplying the center pixel by the weight 5, the pixels closer to the center pixel by the weights 2 or 3, and the pixels farthest from the center pixel by the weight 1.
- The weights in the weighting window are fixed. As the weighting window moves relative to the detection area in the subsequent steps, another pixel becomes the center pixel and is multiplied by the weight 5, while the surrounding pixels are multiplied by 1, 2, or 3 according to their distance from the center pixel, from far to near.
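A plausible construction of such a window (the exact per-cell 2-versus-3 assignment of FIG. 6 is not reproduced here; the weights simply decrease with Euclidean distance from the center, from 5 down to 1):

```python
import math
import numpy as np

def make_weight_window(n=2):
    """(2n+1) x (2n+1) window: center weight 5, then 3, 2, 1 as the
    Euclidean distance to the center grows."""
    size = 2 * n + 1
    window = np.empty((size, size), dtype=int)
    for r in range(size):
        for c in range(size):
            d = math.hypot(r - n, c - n)
            if d == 0:
                window[r, c] = 5      # center pixel
            elif d <= 1.5:
                window[r, c] = 3      # nearest ring
            elif d <= 2.3:
                window[r, c] = 2      # intermediate distance
            else:
                window[r, c] = 1      # farthest pixels (corners)
    return window

w = make_weight_window()
# w[2, 2] == 5 (center), w[0, 0] == 1 (farthest corner)
```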
- Weighting the pixels means multiplying the pixel values of the pixels by the weights. For a pixel on an edge line, the pixel value differs from the pixel values of its adjacent pixels. By multiplying the center pixel of the weighting window by a larger weight and its adjacent pixels by smaller weights, the difference between the edge-line pixel and its adjacent pixels is enlarged. After the pixels covered by the weighting window are weighted, the neighborhood difference degree between the center pixel and the other pixels is calculated.
- The neighborhood difference degree calculated in step S243 is the enlarged neighborhood difference degree of the center pixel. The calculation of the neighborhood difference degree is similar to that in step S21 and is not repeated. Optionally, the maximum of the differences between the weighted center pixel and the other weighted pixels may be used as the neighborhood difference degree.
- One placement of the weighting window enlarges the neighborhood difference degree of only one center pixel, but every pixel in the second detection area needs to be detected, so every pixel must undergo the neighborhood difference enlargement before the second edge points can be extracted. Therefore, in step S244, the second detection area and the weighting window are moved relative to each other, and steps S242–S243 are executed repeatedly until the second detection area has been traversed, completing the enlargement of the neighborhood difference degree of all the pixels in the second detection area.
- The relative movement of the second detection area and the weighting window may move the weighting window relative to the second detection area along the row direction or the column direction of the pixels with a step size of one pixel. For example, the weighting window first moves along the row direction, enlarging the neighborhood difference degree of each pixel in the row in turn; it then moves one pixel along the column direction and traverses the pixels of the next row, so that the weighting window and the second detection area move relative to each other in an S-shaped path.
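Steps S241–S244 can be sketched as the following sliding-window loop, using the optional maximum-difference definition of the enlarged neighborhood difference degree (the 3 × 3 toy weights are illustrative, not the 5 × 5 window of FIG. 6; the traversal order does not affect the result, so a plain row scan stands in for the S-shaped path):

```python
import numpy as np

def amplified_differences(image, weights):
    """Return a map of enlarged neighborhood difference degrees for every
    pixel the window fits around; border pixels are left at 0."""
    n = weights.shape[0] // 2
    out = np.zeros(image.shape, dtype=float)
    rows, cols = image.shape
    for r in range(n, rows - n):
        for c in range(n, cols - n):
            patch = image[r - n:r + n + 1, c - n:c + n + 1].astype(float)
            weighted = patch * weights
            center = weighted[n, n]
            diffs = np.abs(weighted - center)
            diffs[n, n] = 0.0               # ignore the center itself
            out[r, c] = diffs.max()         # max weighted difference
    return out

weights = np.array([[1, 2, 1],
                    [2, 5, 2],
                    [1, 2, 1]])             # toy 3x3 weighting window
img = np.full((5, 5), 10.0)
img[2, 2] = 20.0                            # one weak bright pixel
diffmap = amplified_differences(img, weights)
# the weak pixel receives the largest enlarged difference degree
```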
- After the enlargement processing, the second edge points can be extracted based on the enlarged neighborhood difference degree; pixels with larger neighborhood difference degrees are considered second edge points. The second edge points can be extracted in various ways, such as the following step S245 or step S246.
- S245 Compare the neighborhood difference degree after the enlargement process with a preset difference threshold, and select a pixel point corresponding to the neighborhood difference degree that is greater than the difference threshold value as the second edge point.
- the method for extracting the second edge point in step S245 is to set a difference threshold. If the neighborhood difference after the enlargement process is greater than the difference threshold, the corresponding pixel point is considered to be the second edge point.
- S246 Sort the neighborhood difference degree after the enlargement processing from large to small, and filter out the pixels corresponding to the preset number of neighborhood difference degrees that are ranked first as the second edge point.
- In step S246, after the neighborhood difference degrees of all the pixels in the second detection area have been enlarged, the enlarged neighborhood difference degrees are sorted from large to small, and the pixels corresponding to the top preset number of neighborhood difference degrees are selected as the second edge points. The preset number can be set according to the specific situation: if it is set too large, edge points cannot be filtered effectively; if it is set too small, edge points are easily missed.
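Step S246 reduces to a top-k selection; a minimal sketch (the coordinates and degree values are made-up illustration values):

```python
def top_k_edge_points(degrees, preset_number):
    """degrees: dict mapping pixel coordinate -> enlarged neighborhood
    difference degree. Keep the pixels with the largest degrees."""
    ranked = sorted(degrees.items(), key=lambda kv: kv[1], reverse=True)
    return [point for point, _ in ranked[:preset_number]]

pts = top_k_edge_points({(0, 1): 12.0, (0, 2): 90.0, (1, 1): 40.0}, 2)
# pts == [(0, 2), (1, 1)]
```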
- Steps S245 and S246 need not both be performed only after the neighborhood difference degrees of all the pixels in the second detection area have been enlarged. Step S245 can start as soon as the weighting window has completed the enlargement for one pixel: the enlarged neighborhood difference degree of that pixel is compared with the preset difference threshold to determine whether the pixel is an edge point, and the second detection area and the weighting window are then moved relative to each other to enlarge the neighborhood difference degree of the next pixel and judge it in turn.
- Step S24 of extracting the second edge point can be understood with reference to FIG. 7, which is a schematic diagram of extracting the second edge point on the image in the embodiment shown in FIG. 2.
- Step S23 determines four second detection areas: the area between pixels a1 and a2, the area between pixels b1 and b2, the area between pixels c1 and c2, and the area between pixels c2 and d1. A second edge point e1 is extracted in the area between pixels a1 and a2; a second edge point e2 is extracted in the area between pixels b1 and b2; a second edge point e3 is extracted in the area between pixels c1 and c2; and a second edge point e4 is extracted in the area between pixels c2 and d1.
- The first edge points and the second edge points can be fitted to determine the edge in the image. If these edge points are not enough to determine the edge accurately when processing the image, the following steps may be used in this embodiment to further extract edge points.
- S25 Determine a third detection area according to the second edge points. This step may follow the principle described in steps S12 and S23 of the above embodiments; similarly, the third detection area is smaller than the second detection area. For example, corresponding to FIG. 7, three third detection areas may be determined: the area between pixels a1 and e1, the area between pixels e2 and b2, and the area between pixels e4 and d1. After the third detection areas are determined, the edge points in them are extracted.
- S26 Perform average processing on the neighborhood difference of each pixel point in the third detection area, and extract a third edge point based on the neighborhood difference after the average processing.
- In step S26, the edge points not detected in steps S21 and S24 are further extracted. The neighborhood difference degree of each pixel is averaged, and the third edge points are then extracted based on the averaged neighborhood difference degree.
- FIG. 8 is a schematic flowchart of extracting a third edge point in the embodiment shown in FIG. 2.
- the extraction of the third edge point includes the following steps.
- S261 A mean window is set in the third detection area.
- This step S261 is basically similar to step S241 above, and the details are not repeated.
- S262 Average the neighborhood difference degrees of the pixels covered by the mean window at its current position, and use the result as the neighborhood difference degree of the center pixel among the covered pixels. That is, the mean of the neighborhood difference degrees of all pixels within the mean window is calculated, and the calculated mean is taken as the processed neighborhood difference degree of the center pixel of the mean window.
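Step S262 can be sketched as a uniform averaging pass over a map of neighborhood difference degrees (leaving border pixels unchanged is a simplifying assumption the patent does not specify):

```python
import numpy as np

def mean_window_process(degrees, n=1):
    """Replace each pixel's neighborhood difference degree by the mean
    over a (2n+1) x (2n+1) mean window; borders are left unchanged."""
    out = degrees.astype(float).copy()
    rows, cols = degrees.shape
    for r in range(n, rows - n):
        for c in range(n, cols - n):
            out[r, c] = degrees[r - n:r + n + 1, c - n:c + n + 1].mean()
    return out

d = np.zeros((3, 3))
d[1, 1] = 9.0                      # one isolated large difference degree
smoothed = mean_window_process(d)
# smoothed[1, 1] == 1.0: the single spike is spread over the 3x3 window
```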
- S263 Move the third detection area and the mean window relative to each other, similar to step S244, and return to step S262 until the neighborhood difference averaging has been performed on all the pixels in the third detection area.
- Then, the third edge points are extracted according to the averaged neighborhood difference degree; pixels with larger neighborhood difference degrees are regarded as third edge points.
- Similar to step S245 or step S246, the third edge points can be extracted according to the averaged neighborhood difference degree, and the following two methods can be adopted.
- S264 Compare the neighborhood difference degree after the mean value processing with a preset difference threshold, and select a pixel point corresponding to the neighborhood difference degree that is greater than the difference threshold value as the third edge point.
- S265 Sort the mean-processed neighborhood difference degrees from large to small, and select the pixels corresponding to a preset number of top-ranked neighborhood difference degrees as third edge points.
- steps S264 and S265 need not be executed only after the mean processing of the neighborhood difference degrees of all pixels in the third detection area has been completed. For example, step S264 may be performed as soon as the mean processing of the neighborhood difference degree of one pixel has been completed using the mean window: the mean-processed neighborhood difference degree of that pixel is compared with the preset difference threshold to determine whether it is an edge point. Then the third detection area and the mean window are moved relative to each other to perform the neighborhood difference mean processing on the next pixel, and to determine whether it is an edge point.
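The two selection options, S264 (thresholding) and S265 (top-N ranking), can be sketched as below. The function names, array layout, and example values are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def select_by_threshold(diff_map, thresh):
    # S264-style selection (sketch): keep the pixels whose mean-processed
    # neighborhood difference degree exceeds a preset threshold.
    ys, xs = np.nonzero(diff_map > thresh)
    return list(zip(ys.tolist(), xs.tolist()))

def select_top_n(diff_map, n):
    # S265-style selection (sketch): sort the difference degrees from
    # large to small and keep the pixels of the n top-ranked values.
    flat = np.argsort(diff_map, axis=None)[::-1][:n]
    ys, xs = np.unravel_index(flat, diff_map.shape)
    return list(zip(ys.tolist(), xs.tolist()))

d = np.array([[1., 5., 2.],
              [7., 0., 3.]])
print(select_by_threshold(d, 4.0))  # [(0, 1), (1, 0)]
print(select_top_n(d, 2))           # [(1, 0), (0, 1)]
```

Both variants return pixel coordinates; which one to use depends on whether a fixed sensitivity (threshold) or a fixed edge-point budget (top N) is preferred.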
- Step S26 of extracting the third edge point can be understood with reference to FIG. 9, which is a schematic diagram of extracting the third edge point from the image in the embodiment shown in FIG. 2.
- Step S25 based on the third detection area is determined, in conjunction with the example of FIG. 7, step S25 determines a third detection zone 3: pixel points a 1, a region between e 1, pixel e 2, b 2 a region between the pixel point e 4, the region between 1 d.
- the third step S26 to extract edge points in said third detection area 3 the pixels a1, region extraction 1, 2, the region between the third edge pixel point f e b between the extraction e1
- the three edge points f 2 and the area between the pixel points e 4 and d 1 extract the third edge points f 3 and f 4 .
- S27 Determine the edge of the image according to the first edge point, the second edge point, and the third edge point.
- the edges of the image can be determined based on the edge points extracted in the three passes, for example by performing a fitting operation on all the edge points to determine the edge line.
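The fitting operation mentioned above can be sketched as a least-squares line fit over the collected edge points. A straight-line model is an assumption for illustration; the patent does not fix the fitting model, and curved edges would need a higher polynomial degree or a spline.

```python
import numpy as np

# Hypothetical edge points gathered over the three extractions,
# given as (x, y) coordinates.
points = np.array([(0, 0.1), (1, 1.0), (2, 2.1), (3, 2.9), (4, 4.0)])
xs, ys = points[:, 0], points[:, 1]

# Least-squares fit of a degree-1 polynomial: y = slope * x + intercept.
slope, intercept = np.polyfit(xs, ys, deg=1)
print(round(slope, 2))  # 0.97: close to 1.0 for these nearly collinear points
```

The fitted line then serves as the detected edge of the image.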
- in this embodiment, a first extraction of edge points is performed according to the neighborhood difference degree; the neighborhood difference degree is then enlarged, and a second extraction is performed according to the enlarged neighborhood difference degree to find edge points with weaker differences; the neighborhood difference degree is then mean-processed, and a third extraction is performed according to the mean-processed neighborhood difference degree to further find edge points with still weaker differences. The edge of the image is determined based on the edge points from the three extractions, so that weak edges in the image can be detected.
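The effect of the staged processing can be illustrated on a one-dimensional difference profile: raw thresholding finds only the strong edge, while mean processing followed by an amplification factor lets a cluster of weak responses clear the same threshold. All numbers here, including the factor 2.0, are invented for the illustration and are not from the patent.

```python
import numpy as np

# One strong response (3.0) and a cluster of weak ones (~1.2).
diffs = np.array([0.2, 0.2, 3.0, 0.2, 1.2, 1.1, 1.3, 0.2])
thresh = 2.0

first_pass = np.nonzero(diffs > thresh)[0]            # strong edge only
smoothed = np.convolve(diffs, np.ones(3) / 3, mode="same")  # mean window
second_pass = np.nonzero(smoothed * 2.0 > thresh)[0]  # illustrative x2 gain

print(first_pass.tolist())   # [2]
print(second_pass.tolist())  # [1, 2, 3, 5] — the weak-edge region now responds too
```

This mirrors the document's point: each later pass is tuned to recover edge points whose differences were too weak for the earlier pass.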
- since the edge detection of an image can be applied to the inspection of a workpiece surface, this application further provides a method for detecting the surface of a workpiece; please refer to FIG. This embodiment realizes the detection of the surface quality of the workpiece, for example detecting whether there are scratches on the surface of the workpiece.
- the detection method specifically includes the following steps.
- S31 The workpiece surface is photographed to obtain an image of the workpiece surface.
- optionally, the workpiece surface may be photographed multiple times to obtain multiple images of the workpiece surface, which are then superimposed and averaged to obtain the detection image, thereby reducing noise in the detection image.
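The superimpose-and-average step can be sketched as below. The function name and shot count are illustrative; the general fact used here is that averaging n shots suppresses zero-mean sensor noise by roughly a factor of sqrt(n).

```python
import numpy as np

def average_shots(shots):
    """Superimpose and average several shots of the workpiece surface.

    `shots` is an iterable of equally sized grayscale images (2-D
    arrays); the averaged result is used as the detection image.
    """
    stack = np.stack([np.asarray(s, dtype=float) for s in shots])
    return stack.mean(axis=0)

# Simulated example: a flat surface plus Gaussian sensor noise.
rng = np.random.default_rng(0)
clean = np.full((4, 4), 100.0)
shots = [clean + rng.normal(0, 5, clean.shape) for _ in range(16)]
detection_image = average_shots(shots)

# The residual deviation of the averaged image is well below the
# per-shot noise level (std 5 here).
print(abs(detection_image - clean).mean() < 5)  # True
```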
- S32 Perform edge detection on the detected image.
- the edge detection method described above is used to perform edge detection on the detection image to determine the edges in the detection image; the inspection of the workpiece surface can then be carried out according to the edge detection result.
- by applying the image edge detection algorithm to the surface inspection of a workpiece, weak scratches on the surface of the workpiece can be detected.
- the method of this application can be implemented by a detection device, with its logical process expressed by a computer program; specifically, it is implemented by an image processing device.
- FIG. 11 is a schematic structural diagram of an embodiment of an image processing apparatus according to the present application.
- the image processing apparatus 100 of this embodiment includes a processor 11 and a memory 12.
- a computer program is stored in the memory 12, and the processor is configured to execute the computer program to implement the foregoing method.
- FIG. 12 is a schematic structural diagram of an embodiment of a computer storage medium of the present application.
- a computer program is stored in the computer storage medium 200 and can be executed to implement the methods of the foregoing embodiments.
- the computer storage medium 200 may be a USB flash drive, an optical disk, a server, or the like.
Claims (12)
- An image edge detection method, characterized in that the method comprises: obtaining a neighborhood difference degree of each pixel in a first detection area of the image, and extracting a first edge point based on the neighborhood difference degree; determining a second detection area according to the first edge point, wherein the second detection area is smaller than the first detection area; performing enlargement processing on the neighborhood difference degree of each pixel in the second detection area, and extracting a second edge point based on the enlarged neighborhood difference degree; and determining an edge of the image according to the first edge point and the second edge point.
- The method according to claim 1, characterized in that the step of determining the second detection area according to the first edge point comprises: classifying the first edge points according to the neighborhood difference degree and/or positional relationship of the first edge points to obtain at least two first edge point blocks, wherein each first edge point block comprises at least one first edge point; and determining the second detection area according to the first edge point blocks.
- The method according to claim 2, characterized in that the step of classifying the first edge points according to the neighborhood difference degree and/or positional relationship of the first edge points comprises: comparing the neighborhood difference degrees of the first edge points with a plurality of preset difference degree threshold segments, and grouping first edge points whose neighborhood difference degrees fall within the same difference degree threshold segment into one class to form primary classification blocks; and determining the first edge point blocks based on the primary classification blocks.
- The method according to claim 3, characterized in that the step of determining the first edge point blocks based on the primary classification blocks comprises: taking the primary classification blocks as the first edge point blocks; or further grouping into a secondary classification block those primary classification blocks whose corresponding difference degree threshold segments differ by less than a preset inter-segment difference threshold and whose shortest distance to an adjacent primary classification block is less than a preset distance threshold, and taking the secondary classification block as a first edge point block.
- The method according to claim 2, characterized in that the step of determining the second detection area according to the first edge point blocks comprises: determining the second detection area between adjacent first edge points within each first edge point block; and/or determining the second detection area between adjacent first edge points of adjacent first edge point blocks.
- The method according to claim 1, characterized in that the step of performing enlargement processing on the neighborhood difference degree of each pixel in the second detection area and extracting the second edge point based on the enlarged neighborhood difference degree comprises: setting a weighting window in the second detection area; weighting the pixels covered by the weighting window at its current position, wherein the weighting value corresponding to the central pixel among the pixels covered by the weighting window is greater than the weighting values corresponding to the other pixels; calculating the neighborhood difference degree between the weighted central pixel and the other pixels as the enlarged neighborhood difference degree of the central pixel; and moving the second detection area and the weighting window relative to each other and returning to the step of weighting the pixels covered by the weighting window at its current position, until the weighting window has traversed the second detection area.
- The method according to claim 6, characterized in that the farther the other pixels are from the central pixel, the smaller their corresponding weighting values.
- The method according to claim 1, characterized in that the step of performing enlargement processing on the neighborhood difference degree of each pixel in the second detection area and extracting the second edge point based on the enlarged neighborhood difference degree comprises: comparing the enlarged neighborhood difference degrees with a preset difference threshold, and selecting the pixels whose enlarged neighborhood difference degrees are greater than the difference threshold as the second edge points; or sorting the enlarged neighborhood difference degrees from large to small, and selecting the pixels corresponding to a preset number of top-ranked neighborhood difference degrees as the second edge points.
- The method according to claim 1, characterized in that the step of determining the edge of the image according to the first edge point and the second edge point comprises: determining a third detection area according to the second edge point, the third detection area being smaller than the second detection area; performing mean processing on the neighborhood difference degree of each pixel in the third detection area, and extracting a third edge point based on the mean-processed neighborhood difference degree; and determining the edge of the image according to the first edge point, the second edge point, and the third edge point.
- The method according to claim 9, characterized in that the step of performing mean processing on the neighborhood difference degree of each pixel in the third detection area and extracting the third edge point based on the mean-processed neighborhood difference degree comprises: setting a mean window in the third detection area; performing mean processing on the neighborhood difference degrees of the pixels covered by the mean window at its current position, and taking the result as the new neighborhood difference degree of the central pixel among the pixels covered by the mean window; and moving the third detection area and the mean window relative to each other and returning to the step of performing mean processing on the neighborhood difference degrees of the pixels covered by the mean window at its current position, until the mean window has traversed the third detection area.
- An image processing device, characterized in that the device comprises a processor and a memory, the memory storing a computer program, and the processor being configured to execute the computer program to implement the method according to any one of claims 1-10.
- A computer storage medium, characterized in that the computer storage medium is used to store a computer program, the computer program being executable to implement the method according to any one of claims 1-10.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2018/104892 WO2020051746A1 (zh) | 2018-09-10 | 2018-09-10 | Image edge detection method, image processing device, and computer storage medium |
CN201880087301.9A CN111630563B (zh) | 2018-09-10 | 2018-09-10 | Image edge detection method, image processing device, and computer storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020051746A1 true WO2020051746A1 (zh) | 2020-03-19 |
Family
ID=69776940
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9014265B1 (en) * | 2011-12-29 | 2015-04-21 | Google Inc. | Video coding using edge detection and block partitioning for intra prediction |
CN104700421A (zh) * | 2015-03-27 | 2015-06-10 | 中国科学院光电技术研究所 | A Canny-based adaptive-threshold edge detection algorithm |
CN104809800A (zh) * | 2015-04-14 | 2015-07-29 | 深圳怡化电脑股份有限公司 | Preprocessing method for extracting banknote splicing traces, and spliced-banknote recognition method and device |
CN107292897A (zh) * | 2016-03-31 | 2017-10-24 | 展讯通信(天津)有限公司 | Image edge extraction method, apparatus and terminal for the YUV domain |
Also Published As
Publication number | Publication date |
---|---|
CN111630563A (zh) | 2020-09-04 |
CN111630563B (zh) | 2022-02-18 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18933301 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18933301 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as the address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 13.08.2021) |