WO2020051746A1 - Image edge detection method, image processing device, and computer storage medium - Google Patents

Image edge detection method, image processing device, and computer storage medium

Info

Publication number
WO2020051746A1
WO2020051746A1 · PCT/CN2018/104892 · CN2018104892W
Authority
WO
WIPO (PCT)
Prior art keywords
edge point
detection area
edge
neighborhood
point
Prior art date
Application number
PCT/CN2018/104892
Other languages
English (en)
French (fr)
Inventor
阳光
Original Assignee
深圳配天智能技术研究院有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳配天智能技术研究院有限公司 filed Critical 深圳配天智能技术研究院有限公司
Priority to PCT/CN2018/104892 priority Critical patent/WO2020051746A1/zh
Priority to CN201880087301.9A priority patent/CN111630563B/zh
Publication of WO2020051746A1 publication Critical patent/WO2020051746A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • the present application relates to the field of image detection, and in particular, to an image edge detection method, an image processing device, and a computer storage medium.
  • Edge detection is a basic operation in image processing and is applied in many fields. In the industrial field, for example, edge detection is used to inspect the surface quality of a workpiece: an image of the workpiece surface is acquired, and edge detection is performed on that image to detect scratches on the surface. In practice the workpiece surface is often only lightly scratched, so the corresponding edges in the image have weak contrast and are difficult to detect.
  • the present application provides an image edge detection method, an image processing device, and a computer storage medium to solve the problem that it is difficult to detect weak edges in an image in the prior art.
  • the present application provides an image edge detection method.
  • The method includes: obtaining a neighborhood difference degree of each pixel in a first detection area of the image, and extracting first edge points based on the neighborhood difference degree; determining a second detection area according to the first edge points, wherein the second detection area is smaller than the first detection area; enlarging the neighborhood difference degree of each pixel in the second detection area, and extracting second edge points based on the enlarged neighborhood difference degree; and determining the edge of the image according to the first edge points and the second edge points.
  • the present application provides an image processing device.
  • the device includes a processor and a memory.
  • the memory stores a computer program, and the processor is configured to execute the computer program to implement the foregoing method.
  • the present application provides a computer storage medium.
  • the computer storage medium is used to store a computer program, and the computer program can be executed to implement the foregoing method.
  • The extraction of edge points is thus divided into two passes. First, for the larger first detection area, first edge points are extracted according to the neighborhood difference degree. The neighborhood difference degree of the edge points that remain unextracted is weak, so a smaller second detection area is determined for further extraction in a local range: the neighborhood difference degree of each pixel in it is enlarged, and second edge points are extracted based on the enlarged values. The edge of the image is then determined according to the first and second edge points. By enlarging the weak neighborhood differences before the second extraction, this process achieves detection of weak edges.
  • FIG. 1 is a schematic flowchart of an embodiment of an image edge detection method according to the present application;
  • FIG. 2 is a schematic flowchart of another embodiment of an image edge detection method of the present application;
  • FIG. 3 is a schematic diagram of obtaining first edge point blocks on an image in one manner in the embodiment shown in FIG. 2;
  • FIG. 4 is a schematic diagram of obtaining first edge point blocks on an image in another manner in the embodiment shown in FIG. 2;
  • FIG. 5 is a schematic flowchart of extracting a second edge point in the embodiment shown in FIG. 2;
  • FIG. 6 is a schematic diagram of extracting second edge points on an image in the embodiment shown in FIG. 2;
  • FIG. 7 is a schematic diagram of a weighting window in the embodiment shown in FIG. 2;
  • FIG. 8 is a schematic flowchart of extracting a third edge point in the embodiment shown in FIG. 2;
  • FIG. 9 is a schematic diagram of extracting third edge points on an image in the embodiment shown in FIG. 2;
  • FIG. 10 is a schematic flowchart of an embodiment of a workpiece surface detection method of the present application;
  • FIG. 11 is a schematic structural diagram of an embodiment of an image processing apparatus of the present application;
  • FIG. 12 is a schematic structural diagram of an embodiment of a computer storage medium of the present application.
  • This application provides a method for edge detection of an image. Edge detection identifies points in the image where values change sharply. The method of this application therefore identifies edge points based on the neighborhood difference degree of each pixel, in two main extraction passes: first, edge points are extracted directly from the neighborhood differences; then the neighborhood differences are enlarged and further edge points are extracted from the enlarged values, so that weak edges in the image can be detected.
  • FIG. 1 is a schematic flowchart of an embodiment of an edge detection method of an image of the present application.
  • the edge detection method of an image of this embodiment includes the following steps.
  • S11 Obtain the neighborhood difference degree of each pixel point in the first detection area of the image, and extract the first edge point based on the neighborhood difference degree.
  • This embodiment detects a detection area on the image, which may be the entire image or a certain region within it. To distinguish it from areas introduced later, the detection area in this step is referred to as the first detection area.
  • the degree of neighborhood difference of each pixel in the first detection area is obtained.
  • The neighborhood difference degree indicates how much a pixel differs from its neighboring pixels; it can be a difference in pixel values, a first-order gradient value, or a second-order gradient value.
  • the edge points are extracted based on the degree of neighborhood difference, and pixels with significant differences from neighboring pixels can be extracted as edge points.
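One simple realization of this first pass is sketched below: the neighborhood difference of a pixel is taken here as the maximum absolute pixel-value difference to its 8-connected neighbors, and pixels above a threshold are kept as edge points. The choice of maximum 8-neighbor difference is an assumption for illustration; the application also allows first- or second-order gradient values.

```python
def neighborhood_difference(img, x, y):
    """Neighborhood difference of pixel (x, y): here, the maximum absolute
    pixel-value difference to its 8-connected neighbors (one of several
    measures the application allows)."""
    h, w = len(img), len(img[0])
    center = img[y][x]
    diffs = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                diffs.append(abs(img[ny][nx] - center))
    return max(diffs)

def extract_edge_points(img, threshold):
    """First extraction (S11): keep pixels whose neighborhood difference
    exceeds the threshold."""
    h, w = len(img), len(img[0])
    return [(x, y) for y in range(h) for x in range(w)
            if neighborhood_difference(img, x, y) > threshold]
```

A pixel much brighter than its surroundings, for instance, produces a large neighborhood difference and is picked up by the threshold test.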
  • the above process is the first extraction of edge points in this embodiment.
  • the extracted edge point is called a first edge point.
  • The first edge points are extracted according to the neighborhood difference degree. Edge points with obvious differences are extracted accurately by the process of step S11, but edge points whose differences are weak are generally difficult to extract directly from the difference degree. Therefore, this embodiment uses the following steps to extract the remaining edge points.
  • Next, in step S12, the second detection area is determined according to the first edge points. From the first edge points already found, the approximate position of the corresponding edge line in the image is known, so the second detection only needs to cover the area where that edge line may lie. The second detection area is therefore smaller than the first detection area and lies within it. Determining this area lets the second search look for edge points in a targeted way and improves detection efficiency.
  • For example, the entire image is used as the first detection area, and the smallest area containing all the first edge points is used as the second detection area; the second detection area follows the shape of the first detection area, for example the dashed box shown in FIG. 3.
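For a rectangular first detection area, the "smallest area containing all first edge points" is simply their bounding box. A minimal sketch, assuming points are (x, y) tuples:

```python
def second_detection_area(edge_points):
    """Smallest axis-aligned rectangle containing all first edge points,
    assuming a rectangular first detection area (so the shapes match).
    Returns (x_min, y_min, x_max, y_max)."""
    xs = [x for x, _ in edge_points]
    ys = [y for _, y in edge_points]
    return min(xs), min(ys), max(xs), max(ys)
```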
  • S13 Enlarge the neighborhood difference degree of each pixel point in the second detection area, and extract a second edge point based on the neighborhood difference degree after the enlargement process.
  • In this step, the edge points not found in step S11 are further extracted. The neighborhood difference degree of each pixel in the second detection area is enlarged; the enlargement magnifies the difference between a pixel and its neighboring pixels, so that inconspicuous edge points are highlighted. Second edge points are then extracted based on the enlarged neighborhood difference degree, which makes their extraction more accurate.
  • S14 Determine the edge of the image according to the first edge point and the second edge point.
  • the edges of the image can be determined according to the extracted edge points twice, for example, a fitting operation is performed on all the edge points to determine the edge lines in the image.
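As one concrete possibility for the fitting operation mentioned above, a least-squares line fit over all extracted edge points could be used. This is an illustrative sketch only; the application says a fitting operation is performed but does not prescribe a specific fit.

```python
def fit_line(points):
    """Least-squares fit y = a*x + b through the extracted edge points,
    one possible way to determine an edge line of the image."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```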
  • In this embodiment, edge points are first extracted according to the neighborhood difference degree; the neighborhood difference degree is then enlarged, and edge points are extracted a second time from the enlarged values to find the points whose differences are weaker. The edge of the image is determined from the edge points extracted in both passes, so that weak edges in the image are detected.
  • FIG. 2 is a schematic flowchart of another embodiment of an edge detection method of an image of the present application.
  • the edge detection method of an image of this embodiment includes the following steps.
  • S21 Obtain the neighborhood difference degree of each pixel point in the first detection area of the image, and extract the first edge point based on the neighborhood difference degree.
  • The neighborhood difference degree may be a difference between pixel values, a first-order gradient value, or a second-order gradient value.
  • A threshold is set for the difference value, the first-order gradient value, or the second-order gradient value, and the values are filtered against it to extract the first edge points. This process can be implemented with existing edge detection algorithms such as the Canny or Sobel algorithm.
  • S22 Classify the first edge points according to the neighborhood difference degree and/or the positional relationship of the first edge points to obtain at least two first edge point blocks.
  • The first edge points are classified, and each resulting first edge point block includes at least one first edge point. Edge points that cannot be classified can be discarded, which removes interference points from the first edge points extracted in step S21. The classification is also used in the following step S23 to determine the second detection area from the first edge point blocks, which makes that determination more targeted and efficient.
  • The first edge points are classified according to the neighborhood difference degree and/or their positional relationship, specifically in one of the following ways.
  • First classification method: compare the neighborhood difference degree of each first edge point with a plurality of preset difference-threshold segments, group the first edge points whose neighborhood difference degrees fall within the same segment into one class to form a classification block, and use each classification block as a first edge point block.
  • In this way the first edge points are classified according to the preset difference-threshold segments to obtain the first edge point blocks. The classification can also be performed within step S21: when the first edge points are extracted according to the neighborhood difference degree of the pixels, multiple difference-threshold segments are set, the first edge points are filtered against them, and the points whose neighborhood difference degrees fall within the same segment are grouped into one first edge point block.
  • FIG. 3 is a schematic diagram of obtaining a first edge point block on an image in a manner in the embodiment shown in FIG. 2.
  • The first edge points are classified according to a plurality of difference-threshold segments, and 4 first edge point blocks are obtained: (a1, a2, a3), (b1, b2), (c1, c2), (d1, d2, d3, d4).
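The first classification method can be sketched as follows. The segment representation as half-open `(low, high)` intervals is an assumption for illustration; points whose difference degree matches no segment are discarded as interference, as described above.

```python
def classify_by_segments(points, diffs, segments):
    """Group first edge points whose neighborhood difference degrees fall
    in the same difference-threshold segment. `segments` is a list of
    (low, high) intervals; unmatched points are discarded."""
    blocks = {i: [] for i in range(len(segments))}
    for p, d in zip(points, diffs):
        for i, (lo, hi) in enumerate(segments):
            if lo <= d < hi:
                blocks[i].append(p)
                break
    return [b for b in blocks.values() if b]
```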
  • Second classification method: compare the neighborhood difference degree of each first edge point with the preset difference-threshold segments and group the points falling within the same segment into primary classification blocks, as above. Then, primary classification blocks whose corresponding threshold segments differ by less than a preset inter-segment difference threshold, and whose shortest distance to an adjacent block is less than a preset distance threshold, are further merged into a secondary classification block; the secondary classification blocks are used as the first edge point blocks.
  • The shortest distance is the shortest distance between classification blocks. The primary classification blocks are, for example, the four blocks formed in FIG. 3: (a1, a2, a3), (b1, b2), (c1, c2), (d1, d2, d3, d4).
  • The primary classification blocks are then merged into secondary classification blocks. The criterion for merging several primary blocks into one class is: the difference between the difference-threshold segments corresponding to those blocks is less than the preset inter-segment difference threshold, and the shortest distance between each primary block and its adjacent primary blocks is less than the preset distance threshold. If a block has multiple adjacent blocks, the shortest distance to each of them is computed, and the smallest of these shortest distances is used for the judgment.
  • The difference between segments is the difference between the maximum values, the minimum values, or the center values of the two difference-threshold segments.
  • The shortest distance to an adjacent classification block is the distance between the nearest pixels in the two adjacent blocks. For example, the shortest distance between the block (a1, a2, a3) and the adjacent block (b1, b2) is the distance between pixels a3 and b1; the shortest distance between the block (c1, c2) and the adjacent block (b1, b2) is the distance between pixels c1 and b2.
  • Primary classification blocks whose inter-segment difference is less than the preset inter-segment difference threshold and whose shortest distance to an adjacent block is less than the distance threshold are merged into a secondary classification block. In addition, the distances of the first edge points within a block can be checked; for example, a first edge point whose distance to its nearest neighboring first edge point exceeds a threshold is discarded.
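The merge criterion of the second classification method can be sketched with a small union-find. Using the segment center values for the inter-segment difference is one of the three options the text names (maximum, minimum, or center values), chosen here for illustration.

```python
import math

def shortest_distance(block_a, block_b):
    """Shortest distance between two classification blocks: the distance
    between their closest pair of pixels."""
    return min(math.dist(p, q) for p in block_a for q in block_b)

def merge_blocks(blocks, centers, seg_diff_thresh, dist_thresh):
    """Merge primary blocks into secondary blocks when both the difference
    of their segment center values and their shortest distance fall below
    the preset thresholds."""
    n = len(blocks)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if (abs(centers[i] - centers[j]) < seg_diff_thresh and
                    shortest_distance(blocks[i], blocks[j]) < dist_thresh):
                parent[find(i)] = find(j)
    merged = {}
    for i in range(n):
        merged.setdefault(find(i), []).extend(blocks[i])
    return list(merged.values())
```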
  • FIG. 4 is a schematic diagram of obtaining a first edge point block on an image in another manner in the embodiment shown in FIG. 2.
  • The primary classification blocks (a1, a2, a3), (b1, b2), (c1, c2), (d1, d2, d3, d4) are further merged into the secondary classification blocks (a1, a2, a3, b1, b2, c1, c2) and (d1, d2). The shortest distance between the primary block (c1, c2) and the primary block (d1, d2), namely the distance between c2 and d1, exceeds the threshold, so the primary block (d1, d2) forms a secondary classification block on its own.
  • After step S22 classifies the first edge points and obtains the first edge point blocks, the second detection area is determined from these blocks in the following step. A second detection area may be determined between two adjacent first edge points within a block: the distances between the first edge points in the block are computed first, and the region between two adjacent points whose distance exceeds a set range is taken as a second detection area.
  • Specifically, the two first edge points that are relatively far apart are used as end points of the second detection area, and the area is determined from these end points and the shape of the first detection area; the shape of the second detection area is the same as the shape of the first detection area.
  • A second detection area may also be determined between two adjacent first edge points belonging to adjacent first edge point blocks: for two adjacent blocks, the region between their two nearest first edge points is taken as a second detection area, with those two points as its end points and the shape again following the first detection area.
  • If the second detection area were determined directly from the first edge points, the distances between all the first edge points would have to be examined. By first grouping the points into first edge point blocks, only the distances within blocks and between adjacent blocks need to be calculated, so this step S23 is more efficient.
  • For example, with the first edge point blocks (a1, a2, a3, b1, b2, c1, c2) and (d1, d2), step S23 determines the following second detection areas: within a block, between adjacent first edge points whose distance exceeds the set range, the areas between pixels a1 and a2, between b1 and b2, and between c1 and c2; and between two adjacent first edge points of adjacent blocks, the area between pixels c2 and d1.
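The within-block gap search can be sketched as below, assuming the points of a block are already ordered along the edge; each over-long gap between adjacent points delimits one second detection area.

```python
import math

def gap_areas(block, max_gap):
    """Within one first edge point block (points ordered along the edge),
    return the pairs of adjacent points whose spacing exceeds `max_gap`;
    each pair delimits a second detection area."""
    return [(block[i], block[i + 1])
            for i in range(len(block) - 1)
            if math.dist(block[i], block[i + 1]) > max_gap]
```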
  • the edge points in the second detection area are extracted.
  • S24 Enlarge the neighborhood difference degree of each pixel point in the second detection area, and extract a second edge point based on the neighborhood difference degree after the enlargement process.
  • In step S24 the edge points not detected in step S21 are further extracted: the neighborhood difference degree of each pixel is enlarged to further highlight the difference between the pixel and its neighboring pixels, and the second edge points are extracted from the enlarged neighborhood difference degrees.
  • FIG. 5 is a schematic flowchart of extracting a second edge point in the embodiment shown in FIG. 2.
  • the extraction process of the second edge point includes the following steps.
  • Setting a weighting window on the second detection area in step S241 means first defining a detection window: each time detection is performed, only the pixels covered by the window are analyzed. The window therefore limits the size and range of the data analyzed at a time; a larger window makes each analysis more comprehensive but slower, while a smaller window is faster but covers less data.
  • The weighting window may be rectangular, circular, fan-shaped, etc.; since the pixels in the second detection area are analyzed point by point, a rectangular window is generally used. The weighting window may be a rectangle covering (2n+1) × (2n+1) pixels, where n is an integer greater than or equal to 1, so that the window has a center point and the covered pixels have a center pixel.
  • In step S242, the pixels covered by the weighting window at its current position are weighted, with the weight of the center pixel larger than the weights of the other covered pixels, so that the weighting enlarges the difference between the center pixel and the others. The weights of pixels farther from the center can be set smaller: an edge line in an image generally spans several columns of pixels rather than one, so the farther a pixel is from an edge point, the more it naturally differs from it. When detecting an edge point, the difference between the edge point and its immediate neighbors should therefore not be enlarged too much, or the result would no longer match the actual image and would distort the judgment.
  • Refer to FIG. 7, which is a schematic diagram of the weighting window in the embodiment shown in FIG. 2. The weighting window is a 5 × 5 rectangular window: the weight of the center pixel is 5, the weight of the pixels farthest from the center is 1, and the weights of the pixels closer to the center are 2 or 3. Weighting the covered pixels means multiplying the center pixel by the weight 5, the nearer surrounding pixels by 2 or 3, and the farthest pixels by 1.
  • The weights in the weighting window are fixed: as the window moves over the detection area in the subsequent steps, each pixel in turn serves as the center pixel and is multiplied by 5, while the surrounding pixels are multiplied by 1, 2, or 3 according to their distance from the center, from far to near.
  • Weighting a pixel means multiplying its pixel value by its weight. The pixel value on an edge line differs from the values of the adjacent pixels; by multiplying the center pixel by a larger weight and its neighbors by smaller weights, the difference between an edge-line pixel and its neighbors is enlarged. After the pixels covered by the weighting window are weighted, the neighborhood difference degree between the center pixel and the other pixels is calculated; the neighborhood difference degree calculated in step S243 is thus the enlarged neighborhood difference degree of the center pixel.
  • the calculation of the neighborhood difference degree is similar to that in step S21, and details are not described again.
  • the maximum difference among the differences between the weighted central pixel and other pixels may be used as the degree of neighborhood difference.
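The weighted amplification of one center pixel can be sketched as follows. The concrete ring weights (5, 3, 1 keyed by Chebyshev distance from the center) are an assumption for illustration; the embodiment only fixes 5 at the center, 1 at the farthest ring, and 2 or 3 in between.

```python
def amplified_difference(img, x, y, weights):
    """Enlarged neighborhood difference of the center pixel (x, y): weight
    every pixel covered by the (2n+1)x(2n+1) window, then take the maximum
    difference between the weighted center and the other weighted pixels.
    `weights` maps Chebyshev ring distance -> weight."""
    n = max(weights)  # window half-size, from the outermost ring
    h, w = len(img), len(img[0])
    center_val = img[y][x] * weights[0]
    diffs = []
    for dy in range(-n, n + 1):
        for dx in range(-n, n + 1):
            if dx == 0 and dy == 0:
                continue
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                ring = max(abs(dx), abs(dy))
                diffs.append(abs(center_val - img[ny][nx] * weights[ring]))
    return max(diffs)

# Assumed weights for a 5x5 window: 5 at the center, 3 one ring out, 1 at the edge.
WEIGHTS_5x5 = {0: 5, 1: 3, 2: 1}
```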
  • One position of the weighting window enlarges the neighborhood difference degree of only one center pixel, but every pixel in the second detection area must be processed to extract the second edge points. Therefore, in step S244, the second detection area and the weighting window are moved relative to each other and steps S242-S243 are repeated until the second detection area has been traversed and the neighborhood difference degrees of all its pixels have been enlarged.
  • The relative movement may step the weighting window over the second detection area with a step size of one pixel, along the row or column direction: the window moves along a row of pixels, enlarging their neighborhood difference degrees in turn, then shifts by one pixel in the column direction and moves back along the next row. In other words, the weighting window and the second detection area move relative to each other in an S shape.
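The S-shaped (boustrophedon) traversal order described above can be generated as a simple sketch:

```python
def s_shape_order(height, width):
    """S-shaped traversal of a detection area: scan one row left to right,
    step down one pixel, then scan the next row right to left."""
    order = []
    for y in range(height):
        xs = range(width) if y % 2 == 0 else range(width - 1, -1, -1)
        order.extend((x, y) for x in xs)
    return order
```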
  • After the enlargement, the second edge points can be extracted based on the enlarged neighborhood difference degrees: pixels with larger enlarged difference degrees are taken as second edge points. The extraction can be performed in several ways, such as the following step S245 or step S246.
  • S245 Compare the neighborhood difference degree after the enlargement process with a preset difference threshold, and select a pixel point corresponding to the neighborhood difference degree that is greater than the difference threshold value as the second edge point.
  • the method for extracting the second edge point in step S245 is to set a difference threshold. If the neighborhood difference after the enlargement process is greater than the difference threshold, the corresponding pixel point is considered to be the second edge point.
  • S246 Sort the neighborhood difference degree after the enlargement processing from large to small, and filter out the pixels corresponding to the preset number of neighborhood difference degrees that are ranked first as the second edge point.
  • In step S246, after the neighborhood difference degrees of all the pixels in the second detection area have been enlarged, they are sorted from large to small, and the pixels corresponding to the preset number of largest difference degrees are taken as second edge points. The preset number is chosen according to the specific situation: if it is set too large, noise points cannot be filtered out; if it is set too small, edge points are easily lost.
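The two selection strategies of steps S245 and S246 can be sketched side by side:

```python
def select_by_threshold(points, diffs, threshold):
    """S245: keep pixels whose enlarged neighborhood difference degree
    exceeds a preset difference threshold."""
    return [p for p, d in zip(points, diffs) if d > threshold]

def select_top_k(points, diffs, k):
    """S246: sort the enlarged difference degrees from large to small and
    keep the pixels for the k largest (too large a k admits noise, too
    small a k loses edge points)."""
    ranked = sorted(zip(points, diffs), key=lambda pd: pd[1], reverse=True)
    return [p for p, _ in ranked[:k]]
```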
  • Steps S245 and S246 need not both wait until the neighborhood difference degrees of all pixels in the second detection area have been enlarged. Step S245, in particular, can run as soon as the weighting window has enlarged the difference degree of one pixel: the enlarged value is compared with the preset difference threshold to decide whether that pixel is an edge point, and the second detection area and the weighting window are then moved relative to each other to enlarge and judge the next pixel.
  • Step S24 of extracting the second edge points can be understood with reference to FIG. 6, which is a schematic diagram of extracting the second edge points on the image in the embodiment shown in FIG. 2.
  • Step S23 determines four second detection areas: the area between pixels a1 and a2, the area between b1 and b2, the area between c1 and c2, and the area between c2 and d1. The second edge point e1 is extracted from the area between pixels a1 and a2; the second edge point e2 from the area between b1 and b2; the second edge point e3 from the area between c1 and c2; and the second edge point e4 from the area between c2 and d1.
  • The first and second edge points can then be fitted to determine the edge in the image. If these edge points are not sufficient to determine the edges accurately, the following steps of this embodiment can be used to extract further edge points.
  • Step S25 determines a third detection area according to the second edge points, following the principle described in steps S12 and S23; the third detection area is smaller than the second detection area. Corresponding to FIG. 6, for example, three third detection areas may be determined: the area between pixels a1 and e1, the area between e2 and b2, and the area between e4 and d1. After the third detection areas are determined, the edge points within them are extracted.
  • S26 Perform average processing on the neighborhood difference of each pixel point in the third detection area, and extract a third edge point based on the neighborhood difference after the average processing.
  • In step S26 the edge points not detected in steps S21 and S24 are further extracted: the neighborhood difference degrees of the pixels are mean-processed, and the third edge points are extracted based on the mean-processed neighborhood difference degrees.
  • FIG. 8 is a schematic flowchart of extracting a third edge point in the embodiment shown in FIG. 2.
  • the extraction of the third edge point includes the following steps.
  • S261 A mean window is set in the third detection area.
  • This step S261 is basically similar to the above step S241, and details are not described in detail.
  • S262 Perform mean processing on the neighborhood difference degrees of the pixels covered by the mean window at its current position, and use the result as the neighborhood difference degree of the center pixel among the covered pixels.
  • the mean difference processing is performed on the neighborhood differences of the pixels covered by the mean window at the current position, that is, the mean of the neighborhood differences of all pixels in the mean window is calculated, and then the calculated
  • the mean value of d is the processed neighborhood difference degree of the central pixel in the mean window.
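The mean processing of step S262, applied across a whole detection area, can be sketched as a sliding-window average of the neighborhood difference map. This is an illustrative sketch, not the patented implementation; the window size, the border handling (border pixels keep their original degrees), and the function name are assumptions.

```python
import numpy as np

def mean_process(diff_map: np.ndarray, k: int = 3) -> np.ndarray:
    """Replace each pixel's neighborhood difference degree with the mean of
    the degrees covered by a k x k mean window centred on it (step S262)."""
    assert k % 2 == 1, "mean window must have a central pixel"
    r = k // 2
    out = diff_map.astype(float).copy()
    h, w = diff_map.shape
    for y in range(r, h - r):
        for x in range(r, w - r):
            window = diff_map[y - r:y + r + 1, x - r:x + r + 1]
            out[y, x] = window.mean()  # mean becomes the central pixel's degree
    return out
```

A single isolated peak is pulled toward its neighbours, while a genuine weak edge shared by several adjacent pixels keeps a relatively high averaged degree, which is why the mean-processed map still separates edge points from noise.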
  • S263: Move the third detection area and the mean window relative to each other, and return to step S262, until the mean window has traversed the third detection area, so that the neighborhood difference mean processing is performed on all pixel points in the third detection area.
  • The third edge points are extracted according to the mean-processed neighborhood difference degrees, where pixel points with larger neighborhood difference degrees are regarded as third edge points. Similar to step S245 or step S246, the following two methods may be adopted to extract the third edge points according to the mean-processed neighborhood difference degrees.
  • S264: Compare the mean-processed neighborhood difference degrees with a preset difference threshold, and select the pixel points whose neighborhood difference degree is greater than the difference threshold as the third edge points.
  • S265: Sort the mean-processed neighborhood difference degrees from large to small, and select the pixel points corresponding to the top preset number of neighborhood difference degrees as the third edge points.
  • The above steps S264 and S265 are not necessarily executed only after the neighborhood difference degrees of all pixel points in the third detection area have been mean-processed. Step S264 may be executed as soon as the mean processing of one pixel point's neighborhood difference degree has been completed with the mean window: the mean-processed neighborhood difference degree of that pixel point is compared with the preset difference threshold to determine whether it is an edge point. The third detection area and the mean window are then moved relative to each other to perform the neighborhood difference mean processing on the next pixel point and determine whether it is an edge point.
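The two selection strategies of steps S264/S265 (and equally S245/S246 earlier) can be sketched as follows. This is a hedged illustration; the array representation of the difference map and the function names are assumptions, not the patent's implementation.

```python
import numpy as np

def select_by_threshold(diff_map: np.ndarray, threshold: float):
    """S264: keep pixels whose processed difference degree exceeds a preset threshold."""
    ys, xs = np.nonzero(diff_map > threshold)
    return list(zip(ys.tolist(), xs.tolist()))

def select_top_k(diff_map: np.ndarray, k: int):
    """S265: sort difference degrees from large to small and keep the top k pixels."""
    flat = np.argsort(diff_map, axis=None)[::-1][:k]
    ys, xs = np.unravel_index(flat, diff_map.shape)
    return list(zip(ys.tolist(), xs.tolist()))
```

The threshold variant can run incrementally, one pixel at a time, exactly as the text notes; the top-k variant inherently needs all degrees first, which matches why only S264 is described as executable before the window finishes traversing.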
  • Step S26 of extracting the third edge points can be understood with reference to FIG. 9, which is a schematic diagram of extracting third edge points from the image in the embodiment shown in FIG. 2.
  • Based on the third detection areas determined in step S25, and continuing the example of FIG. 7, step S25 determines three third detection areas: the region between pixel points a1 and e1, the region between pixel points e2 and b2, and the region between pixel points e4 and d1.
  • In step S26, third edge points are extracted within these three third detection areas: the third edge point f1 is extracted in the region between pixel points a1 and e1, the third edge point f2 is extracted in the region between pixel points e2 and b2, and the third edge points f3 and f4 are extracted in the region between pixel points e4 and d1.
  • S27: Determine the edge of the image according to the first edge points, the second edge points, and the third edge points.
  • After the first, second, and third edge points have been extracted through the above steps, the edge of the image can be determined from the edge points extracted in the three rounds, for example by performing a fitting operation on all the edge points to determine the edge line in the image.
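The fitting operation is not specified further in the text. As one hedged possibility, a least-squares straight-line fit over all extracted edge points could look like the sketch below; assuming the edge is straight is an illustrative choice (a curved edge would need a higher-order model), and the function name and point format are assumptions.

```python
import numpy as np

def fit_edge_line(points):
    """Least-squares fit of an edge line y = m*x + b through all edge points
    gathered in the three extraction rounds; `points` is a list of (x, y)."""
    pts = np.asarray(points, dtype=float)
    m, b = np.polyfit(pts[:, 0], pts[:, 1], deg=1)
    return m, b
```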
  • In this embodiment, the first extraction of edge points is performed according to the neighborhood difference degrees; the neighborhood difference degrees are then amplified, and the second extraction is performed according to the amplified degrees to find edge points with weaker differences; the neighborhood difference degrees are averaged once more, and the third extraction is performed according to the mean-processed degrees to find further edge points with weaker differences. The edge of the image is determined from the edge points of the three extractions, so that weak edges in the image can be detected.
  • The edge detection of images can be applied to the detection of workpiece surfaces. This application therefore further provides a method for detecting a workpiece surface; please refer to FIG. 10, which is a schematic flowchart of an embodiment of the workpiece surface detection method of the present application. This embodiment can realize the detection of workpiece surface quality, for example detecting whether there are scratches on the surface of a workpiece.
  • The detection method specifically includes the following steps.
  • S31: Obtain a detection image of the workpiece surface.
  • In this step, the workpiece surface is photographed to obtain an image of the surface. The workpiece surface may also be photographed multiple times to obtain multiple images, which are then superimposed and averaged to obtain the detection image, thereby reducing the noise in the detection image.
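The superimpose-and-average step can be sketched as below; the function name and input format are assumptions. Averaging N shots of the same static scene suppresses zero-mean sensor noise, whose standard deviation falls roughly as 1/sqrt(N), which is why the averaged detection image is cleaner than any single shot.

```python
import numpy as np

def detection_image(shots):
    """Superimpose several photographs of the same workpiece surface and
    average them pixel-wise to obtain a lower-noise detection image."""
    stack = np.stack([np.asarray(s, dtype=float) for s in shots], axis=0)
    return stack.mean(axis=0)
```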
  • S32: Perform edge detection on the detection image.
  • S33: Realize the detection of the workpiece surface according to the edge detection of the detection image.
  • The above edge detection method is used to perform edge detection on the detection image to determine the edges in it; the detection of the workpiece surface can then be realized according to the edge detection result.
  • In this embodiment, the image edge detection algorithm is applied to the surface detection of workpieces, so that weak scratches on the workpiece surface can be detected.
  • The above methods can all be applied through a detection device; their logical process is expressed by a computer program and is specifically implemented by an image processing device.
  • Please refer to FIG. 11, which is a schematic structural diagram of an embodiment of the image processing device of the present application.
  • The image processing device 100 of this embodiment includes a processor 11 and a memory 12.
  • A computer program is stored in the memory 12, and the processor 11 is configured to execute the computer program to implement the foregoing methods.
  • Please refer to FIG. 12, which is a schematic structural diagram of an embodiment of the computer storage medium of the present application.
  • The computer storage medium 200 of this embodiment stores a computer program, which can be executed to implement the methods of the foregoing embodiments.
  • The computer storage medium 200 may be a USB flash drive, an optical disc, a server, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

This application discloses an image edge detection method, device, and computer storage medium. The method includes: obtaining the neighborhood difference degree of each pixel point in a first detection area of the image, and extracting first edge points based on the neighborhood difference degrees; determining a second detection area according to the first edge points, where the second detection area is smaller than the first detection area; amplifying the neighborhood difference degrees of the pixel points in the second detection area, and extracting second edge points based on the amplified neighborhood difference degrees; and determining the edge of the image according to the first edge points and the second edge points. Through the method of this application, weak edges in an image can be detected relatively accurately.

Description

Image edge detection method, image processing device, and computer storage medium [Technical Field]
This application relates to the field of image detection, and in particular to an image edge detection method, an image processing device, and a computer storage medium.
[Background Art]
Edge detection is a basic operation in image processing and can be applied to different fields. For example, in the industrial field, edge detection is used to inspect the surface quality of a workpiece: an image of the workpiece surface is obtained, and edge detection is performed on the image to detect whether there are scratches on the surface. In practical applications, scratches on the workpiece surface are often shallow, so the edge contrast in the corresponding image is weak and the edges are not easily detected.
[Summary of the Invention]
This application provides an image edge detection method, an image processing device, and a computer storage medium, to solve the problem that weak edges in an image are difficult to detect in the prior art.
To solve the above technical problem, this application provides an image edge detection method, including: obtaining the neighborhood difference degree of each pixel point in a first detection area of the image, and extracting first edge points based on the neighborhood difference degrees; determining a second detection area according to the first edge points, where the second detection area is smaller than the first detection area; amplifying the neighborhood difference degrees of the pixel points in the second detection area, and extracting second edge points based on the amplified neighborhood difference degrees; and determining the edge of the image according to the first edge points and the second edge points.
To solve the above technical problem, this application provides an image processing device, including a processor and a memory; the memory stores a computer program, and the processor is configured to execute the computer program to implement the above method.
To solve the above technical problem, this application provides a computer storage medium for storing a computer program, which can be executed to implement the above method.
When performing edge detection on an image, this application first obtains the neighborhood difference degree of each pixel point in a first detection area of the image and extracts first edge points based on the neighborhood difference degrees; a second detection area smaller than the first detection area is then determined according to the first edge points, so as to further extract edge points within a local range; in the second detection area, the neighborhood difference degrees of the pixel points are amplified, and second edge points are extracted based on the amplified neighborhood difference degrees. The extraction of edge points thus mainly consists of two processes: first, for the larger first detection area, first edge points are extracted according to the neighborhood difference degrees; the edge points not yet extracted have weaker neighborhood difference degrees, so a smaller second detection area is determined and the neighborhood difference degrees within it are amplified to extract second edge points; the edge of the image is then determined according to the first edge points and the second edge points. Through these two edge point extraction processes, the method of this application realizes the detection of weak edges.
[Brief Description of the Drawings]
FIG. 1 is a schematic flowchart of an embodiment of the image edge detection method of this application;
FIG. 2 is a schematic flowchart of another embodiment of the image edge detection method of this application;
FIG. 3 is a schematic diagram of obtaining first edge point blobs on an image in one manner in the embodiment shown in FIG. 2;
FIG. 4 is a schematic diagram of obtaining first edge point blobs on an image in another manner in the embodiment shown in FIG. 2;
FIG. 5 is a schematic flowchart of extracting second edge points in the embodiment shown in FIG. 2;
FIG. 6 is a schematic diagram of extracting second edge points from the image in the embodiment shown in FIG. 2;
FIG. 7 is a schematic diagram of the weighted window in the embodiment shown in FIG. 2;
FIG. 8 is a schematic flowchart of extracting third edge points in the embodiment shown in FIG. 2;
FIG. 9 is a schematic diagram of extracting third edge points from the image in the embodiment shown in FIG. 2;
FIG. 10 is a schematic flowchart of an embodiment of the workpiece surface detection method of this application;
FIG. 11 is a schematic structural diagram of an embodiment of the image processing device of this application;
FIG. 12 is a schematic structural diagram of an embodiment of the computer storage medium of this application.
[Detailed Description of the Embodiments]
The technical solutions of this application will be described clearly and completely below with reference to its embodiments and the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of this application rather than all of them; based on the embodiments in this application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of this application.
This application provides a method for performing edge detection on an image; edge detection identifies the points in an image where changes are obvious. The method of this application therefore identifies edge points according to the neighborhood difference degrees of pixel points, and mainly includes two extraction steps: edge points are first extracted based on the neighborhood difference degrees; the neighborhood difference degrees are then amplified, and edge points are extracted based on the amplified degrees, so as to detect weak edges in the image.
Please refer to FIG. 1, which is a schematic flowchart of an embodiment of the image edge detection method of this application. The image edge detection method of this embodiment includes the following steps.
S11: Obtain the neighborhood difference degree of each pixel point in a first detection area of the image, and extract first edge points based on the neighborhood difference degrees.
This embodiment is used to detect an image. In step S11, a detection area of the image is detected; this detection area may be the entire image or a certain region of the image. To distinguish it from the detection areas in the following steps, the detection area in this step is called the first detection area.
In step S11, the neighborhood difference degree of each pixel point in the first detection area is obtained. The neighborhood difference degree represents the difference between a pixel point and its neighboring pixel points, and may be a difference of pixel values, a first-order gradient value, a second-order gradient value, or the like. Edge points are then extracted based on the neighborhood difference degrees, so that pixel points that differ obviously from their neighbors can be extracted as edge points. This is the first edge point extraction in this embodiment, and the edge points extracted in step S11 are called first edge points.
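The text lists several candidate neighborhood difference degrees: a pixel-value difference, a first-order gradient value, or a second-order gradient value. As one hedged example of the first-order case, the gradient magnitude of each pixel can be computed with central differences; the L1 combination of the two axis gradients used here is an illustrative assumption.

```python
import numpy as np

def neighborhood_difference(img: np.ndarray) -> np.ndarray:
    """One possible neighborhood difference degree: the first-order gradient
    magnitude |dI/dy| + |dI/dx| computed with central differences."""
    gy, gx = np.gradient(img.astype(float))  # gradients along rows and columns
    return np.abs(gx) + np.abs(gy)
```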
The first edge points are extracted according to the neighborhood difference degrees. For edge points with relatively obvious differences, extraction through step S11 is relatively accurate; for edge points whose differences are not obvious, it is generally difficult to extract them according to the difference degree. Therefore, the following steps are used in this embodiment to further extract edge points.
S12: Determine a second detection area according to the first edge points.
After a part of the edge points have been obtained according to the neighborhood difference degrees in step S11, the second detection area is determined according to the first edge points. From the first edge points already determined, the approximate position of the corresponding edge line in the image can be known, so the second detection only needs to examine the region where the edge line lies. In step S12, the second detection area is therefore determined according to the first edge points, that is, the region in which the edge line may exist is determined. The second detection area is smaller than the first detection area and lies within it. This step determines the area of the second detection, so that edge points are searched for in a more targeted way and the detection efficiency is improved.
Specifically, for example, the entire image is taken as the first detection area; after the first edge points are obtained, the smallest region including all the first edge points is taken as the second detection area. The edge of the second detection area follows the edge of the first detection area and has the same shape as the first detection area, for example the dashed box shown in FIG. 3.
S13: Amplify the neighborhood difference degrees of the pixel points in the second detection area, and extract second edge points based on the amplified neighborhood difference degrees.
Step S13 further extracts the edge points not found in step S11. After the second detection area has been determined, the neighborhood difference degrees of the pixel points in it are amplified; the amplification enlarges the difference between a pixel point and its neighbors, thereby highlighting the edge points whose differences are not obvious. Second edge points are then extracted based on the amplified neighborhood difference degrees; amplifying the degrees first ensures that the extraction of the second edge points is more accurate.
S14: Determine the edge of the image according to the first edge points and the second edge points.
After the first and second edge points have been extracted through the above steps, the edge of the image can be determined from the edge points of the two extractions, for example by performing a fitting operation on all the edge points to determine the edge line in the image.
In this embodiment, the first extraction of edge points is performed according to the neighborhood difference degrees; the degrees are then amplified and the second extraction is performed according to the amplified degrees to find the edge points with weaker differences; the edge of the image is determined from the edge points of the two extractions, thereby realizing the detection of weak edges in the image.
Please refer to FIG. 2, which is a schematic flowchart of another embodiment of the image edge detection method of this application. The image edge detection method of this embodiment includes the following steps.
S21: Obtain the neighborhood difference degree of each pixel point in a first detection area of the image, and extract first edge points based on the neighborhood difference degrees.
Step S21 is similar to step S11 in the above embodiment, and the same parts are not repeated. In this embodiment, the neighborhood difference degree may be a difference of pixel values, a first-order gradient value, or a second-order gradient value; when extracting the first edge points, a threshold is set to filter the difference values, first-order gradient values, or second-order gradient values, so as to extract the first edge points. This process can be implemented with existing image edge algorithms, such as the Canny algorithm or the Sobel algorithm.
S22: Classify the first edge points according to their neighborhood difference degrees and/or positional relationships, to obtain at least two first edge point blobs.
In step S22, the first edge points are classified, and each obtained first edge point blob includes at least one first edge point. During classification, edge points that cannot be classified may be discarded, thereby removing interference points from the first edge points extracted in step S21. The classification also makes the determination of the second detection area from the first edge point blobs in the following step S23 more targeted and more efficient.
There are multiple ways of classification, all based on the features of the first edge points. In this embodiment, the first edge points are classified according to their neighborhood difference degrees and/or positional relationships, specifically in the following ways.
In the first classification manner, the neighborhood difference degrees of the first edge points are compared with multiple preset difference threshold segments, and the first edge points whose neighborhood difference degrees fall within the same difference threshold segment are taken as one class to form a primary classification blob; the primary classification blobs are taken as the first edge point blobs.
This classification may be performed after the first edge points are extracted in step S21, classifying them according to the preset difference threshold segments to obtain the first edge point blobs. It may also be carried out within step S21: when the first edge points are extracted according to the neighborhood difference degrees of the pixel points, multiple difference threshold segments are set, and while the first edge points are filtered according to these segments, the first edge points whose neighborhood difference degrees fall within the same segment are taken as one class to form a first edge point blob.
The first classification manner can be understood with reference to FIG. 3, which is a schematic diagram of obtaining first edge point blobs on an image in one manner in the embodiment shown in FIG. 2. Classifying the first edge points according to the multiple difference threshold segments yields four first edge point blobs: (a1, a2, a3), (b1, b2), (c1, c2), and (d1, d2, d3, d4).
In the second classification manner, the neighborhood difference degrees of the first edge points are likewise compared with multiple preset difference threshold segments, and the first edge points whose degrees fall within the same segment are taken as one class to form primary classification blobs. Primary classification blobs whose corresponding difference threshold segments differ by less than a preset inter-segment difference threshold, and whose shortest distance to a neighboring primary classification blob is less than a preset distance threshold, are further merged into secondary classification blobs, and the secondary classification blobs are taken as the first edge point blobs. The shortest distance here means the shortest distance between primary classification blobs.
In this manner, the first edge points whose neighborhood difference degrees fall within the same difference threshold segment are first taken as one class to form primary classification blobs, for example the four primary classification blobs in FIG. 3: (a1, a2, a3), (b1, b2), (c1, c2), (d1, d2, d3, d4).
The primary classification blobs are then further grouped into secondary classification blobs. The criterion for grouping multiple primary blobs into one class is: the inter-segment differences between their corresponding difference threshold segments are less than the preset inter-segment difference threshold, and the shortest distance between each primary blob and its neighboring primary blobs is less than the preset distance threshold. If a primary blob has several neighboring blobs, i.e. a shortest distance to each of them, the minimum of these shortest distances is used for the judgment.
The inter-segment difference is the gap between the maximum values, the minimum values, or the central values of two difference threshold segments. The shortest distance is the shortest distance to a neighboring primary classification blob, i.e. the distance between the closest pixel points of two neighboring primary blobs. For example, the shortest distance between the primary blob (a1, a2, a3) and its neighboring primary blob (b1, b2) is the distance between pixel points a3 and b1; the shortest distance between the primary blob (c1, c2) and its neighboring primary blob (b1, b2) is the distance between pixel points c1 and b2.
Primary classification blobs whose inter-segment difference is less than the preset inter-segment difference threshold and whose shortest distance to a neighboring primary blob is less than the distance threshold can be further grouped into a secondary classification blob. During this process, a distance judgment may additionally be made on the first edge points within a primary blob, for example discarding first edge points whose distance to their neighboring first edge points exceeds a threshold.
This classification manner can be understood with reference to FIG. 4, which is a schematic diagram of obtaining first edge point blobs on an image in another manner in the embodiment shown in FIG. 2. The primary classification blobs (a1, a2, a3), (b1, b2), (c1, c2), (d1, d2, d3, d4) are further grouped into the secondary classification blobs (a1, a2, a3, b1, b2, c1, c2) and (d1, d2); since the shortest distance between the primary blobs (c1, c2) and (d1, d2), namely the distance from c2 to d1, is greater than the threshold, the primary blob (d1, d2) is grouped into a secondary classification blob on its own. Taking the secondary classification blobs as the first edge point blobs yields two first edge point blobs: (a1, a2, a3, b1, b2, c1, c2) and (d1, d2). In addition, compared with FIG. 3, FIG. 4 discards the pixel points d3 and d4; that is, the second classification manner can also remove interference points from the extracted first edge points.
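The primary classification step can be sketched as below. This is an illustrative sketch only: the point/degree representation and the `[lo, hi)` segment format are assumptions, and the secondary merging by inter-segment difference and shortest blob distance is omitted for brevity.

```python
def primary_blobs(points, degrees, segments):
    """Group first edge points into primary classification blobs: points whose
    neighborhood difference degrees fall in the same preset difference
    threshold segment [lo, hi) form one blob; points matching no segment are
    discarded as interference."""
    blobs = [[] for _ in segments]
    for p, d in zip(points, degrees):
        for i, (lo, hi) in enumerate(segments):
            if lo <= d < hi:
                blobs[i].append(p)
                break
    return [b for b in blobs if b]
```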
After the first edge points have been classified into first edge point blobs in step S22, the second detection area is determined from the first edge point blobs in the subsequent step.
S23: Determine the second detection area according to the first edge point blobs.
In this step, the second detection area may be determined between two adjacent first edge points within a first edge point blob. Specifically, the distances between the first edge points within a blob are calculated first, and for two first edge points whose distance exceeds a set range, the region between them is determined as a second detection area. For example, the two relatively distant first edge points are taken as the endpoints of the second detection area, and the area is determined from these endpoints and the shape of the first detection area; the second detection area has the same shape as the first detection area.
A second detection area may also be determined between the two adjacent first edge points of two neighboring first edge point blobs: between two neighboring blobs, the region between their two adjacent first edge points is determined as a second detection area. For example, the two adjacent first edge points are taken as the endpoints of the second detection area, which is determined from these endpoints and the shape of the first detection area and has the same shape.
Determining the second detection area directly from the first edge points would require judging the distances between all first edge points. In step S23, by contrast, the second detection area can be determined by calculating distances only between the first edge points within each blob, so step S23 is more efficient.
For example, in FIG. 4 the first edge point blobs are (a1, a2, a3, b1, b2, c1, c2) and (d1, d2). The second detection areas determined through step S23 are: between two adjacent first edge points within a blob whose distance exceeds the set range, the region between pixel points a1 and a2, the region between pixel points b1 and b2, and the region between pixel points c1 and c2; and, between the adjacent first edge points of neighboring blobs, the region between pixel points c2 and d1. After the second detection areas are determined, the extraction of edge points within them begins.
S24: Amplify the neighborhood difference degrees of the pixel points in the second detection area, and extract second edge points based on the amplified neighborhood difference degrees.
Step S24 further extracts the edge points not detected in step S21. In this step, the neighborhood difference degrees of the pixel points are amplified to further highlight the differences between the pixel points and their neighbors, and the second edge points are then extracted based on the amplified neighborhood difference degrees.
For the specific process, please refer to FIG. 5, which is a schematic flowchart of extracting second edge points in the embodiment shown in FIG. 2. The extraction of the second edge points includes the following steps.
S241: Set a weighted window in the second detection area.
Setting a weighted window on the second detection area in step S241 means that a detection window is first delimited when detecting the second detection area: each analysis concerns only the pixel points within the coverage of the weighted window. The weighted window limits the size and range of the analyzed data; if the window is large, the analyzed data are comprehensive but each analysis takes longer, and correspondingly, if the window is small, each analysis takes less time but the data are less comprehensive.
The weighted window may be a rectangular, circular, or sector-shaped window. In this embodiment, the pixel points of the second detection area need to be analyzed, so a rectangular window is generally used; the weighted window may be a rectangular window corresponding to (2n+1)×(2n+1) pixel points, where n is an integer greater than or equal to 1. Such a rectangular weighted window has a central point, and the pixel points it covers correspondingly include a central pixel point.
S242: Weight the pixel points covered by the weighted window at its current position.
After the weighted window has been set in the second detection area, the pixel points it covers at the current position can be weighted, where the weight corresponding to the central pixel point among the covered pixel points is greater than the weights corresponding to the other pixel points; thus, after the weighting of step S242, the difference between the central pixel point and the other pixel points is amplified.
Further, the weights may be set so that the farther an other pixel point is from the central pixel point, the smaller its weight. Since an edge line in an image generally corresponds not to a single column of pixel points but to several columns, a pixel point farther from an edge point differs more from it. When detecting edge points, it is therefore undesirable to over-amplify the difference between an edge point and the pixel points immediately next to it; excessive amplification would depart from reality and affect the judgment. Hence, for the weighted window, the weights of the other pixel points may be set smaller the farther they are from the central pixel point.
Take the weighted window shown in FIG. 6, a schematic diagram of the weighted window in the embodiment shown in FIG. 2. The weighted window is a 5×5 rectangular window in which the weight of the central pixel point is 5, the weight of the pixel points farthest from the centre is 1, and the weights of the nearer pixel points are 2 or 3. Weighting the covered pixel points means multiplying the central pixel point by the weight 5, the nearer pixel points by the weight 2 or 3, and the farther pixel points by the weight 1. The weights in the weighted window of this embodiment are fixed: when the window later moves relative to the detection area and another pixel point becomes the central pixel point, that point is multiplied by 5, and the surrounding pixel points are multiplied by 1, 2, and 3 in order from far to near according to their distance from the centre.
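Steps S242 and S243 can be sketched as follows, using the "largest difference from the weighted centre" variant mentioned later in the text. The concrete 3×3 weight matrix in the test is an illustrative assumption (the text's example uses a 5×5 window with weights 5/3/2/1), as are the function name and argument layout.

```python
import numpy as np

def amplified_difference(img: np.ndarray, y: int, x: int,
                         weights: np.ndarray) -> float:
    """Weight the pixels covered by a (2n+1)x(2n+1) weighted window centred on
    (y, x) (centre weight largest, weights shrinking with distance), then take
    the largest absolute difference between the weighted centre pixel and the
    other weighted pixels as the amplified neighborhood difference degree."""
    n = weights.shape[0] // 2
    patch = img[y - n:y + n + 1, x - n:x + n + 1].astype(float) * weights
    center = patch[n, n]
    others = np.delete(patch.ravel(), patch.size // 2)  # drop the centre itself
    return float(np.max(np.abs(center - others)))
```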
S243: Calculate the neighborhood difference degree between the weighted central pixel point and the other pixel points, as the amplified neighborhood difference degree of the central pixel point.
Weighting a pixel point means multiplying its pixel value by the weight. A pixel point on an edge line has a relatively large pixel value while its neighboring pixel points have smaller values; when an edge-line pixel point serves as the central pixel point of the weighted window, it is multiplied by a larger weight while its neighbors are multiplied by smaller weights, which amplifies the difference between the edge-line pixel point and its neighbors. After the covered pixel points have been weighted, the neighborhood difference degree between the central pixel point and the other pixel points is calculated; the degree calculated in step S243 is the amplified neighborhood difference degree of the central pixel point. The calculation of the neighborhood difference degree is similar to that in step S21 and is not repeated. In addition, the largest of the differences between the weighted central pixel point and the other pixel points may be taken as the neighborhood difference degree.
S244: Move the second detection area and the weighted window relative to each other, and return to step S242, until the weighted window has traversed the second detection area.
In steps S241-S244, one placement of the weighted window amplifies the neighborhood difference degree of only one central pixel point, whereas every pixel point of the second detection area must be examined; the amplification must therefore be performed for every pixel point in the second detection area in order to extract the second edge points. In step S244, the second detection area and the weighted window are accordingly moved relative to each other, and steps S242-S243 are repeated until the weighted window has traversed the second detection area, completing the amplification of the neighborhood difference degrees of all its pixel points.
In this embodiment, the relative movement may be to move the weighted window relative to the second detection area along the row or column direction of the pixels with a step of one pixel point. Specifically, the weighted window is moved along the pixel row direction, amplifying the neighborhood difference degrees of a row of pixel points in turn; it is then moved by one pixel step along the column direction and again along the row direction to process the next row, so that the weighted window and the second detection area move relative to each other along an S-shaped path. When the weighted window has traversed the second detection area, the amplification of the neighborhood difference degrees of its pixel points ends.
The second edge points can then be extracted according to the amplified neighborhood difference degrees, where pixel points with larger degrees are regarded as second edge points. Multiple ways can be used to extract the second edge points, for example the following step S245 or step S246.
S245: Compare the amplified neighborhood difference degrees with a preset difference threshold, and select the pixel points whose neighborhood difference degree is greater than the difference threshold as the second edge points.
In step S245, a difference threshold is set; if an amplified neighborhood difference degree is greater than this threshold, the corresponding pixel point is considered a second edge point.
S246: Sort the amplified neighborhood difference degrees from large to small, and select the pixel points corresponding to the top preset number of neighborhood difference degrees as the second edge points.
In step S246, after the neighborhood difference degrees of all pixel points in the second detection area have been amplified, the amplified degrees are sorted from large to small, and the pixel points corresponding to the top preset number of degrees are selected as the second edge points. The preset number can be set according to the specific situation: if it is too large, it provides no filtering of edge points; if it is too small, edge points are easily lost.
Steps S245 and S246 are not necessarily executed only after the neighborhood difference degrees of all pixel points in the second detection area have been amplified. Step S245 may be executed as soon as the amplification of one pixel point's neighborhood difference degree has been completed with a weighted window: the amplified degree of that pixel point is compared with the preset difference threshold to determine whether it is an edge point. The second detection area and the weighted window are then moved relative to each other to amplify the neighborhood difference degree of the next pixel point and determine whether it is an edge point.
Step S24 of extracting the second edge points can be understood with reference to FIG. 7, which is a schematic diagram of extracting second edge points from the image in the embodiment shown in FIG. 2.
Based on the second detection areas determined in step S23, and continuing the example of FIG. 4, step S23 determines four second detection areas: the region between pixel points a1 and a2, the region between pixel points b1 and b2, the region between pixel points c1 and c2, and the region between pixel points c2 and d1.
In step S24, the second edge points are extracted within these four second detection areas: the second edge point e1 is extracted in the region between pixel points a1 and a2, the second edge point e2 in the region between b1 and b2, the second edge point e3 in the region between c1 and c2, and the second edge point e4 in the region between c2 and d1.
After step S24 is completed and more edge points have been extracted, the first edge points and the second edge points can be fitted to determine the edge in the image. If, when processing the image, the edge points are insufficient to determine the edge accurately, this embodiment may further extract edge points through the following steps.
S25: Determine a third detection area according to the second edge points.
Step S25 may determine the third detection area from the second edge points based on the principle described in steps S12 and S23 of the above embodiments; likewise, the third detection area is smaller than the second detection area. For example, corresponding to FIG. 7, three third detection areas may be determined: the region between pixel points a1 and e1, the region between pixel points e2 and b2, and the region between pixel points e4 and d1. After the third detection areas are determined, the extraction of edge points within them begins.
S26: Perform mean processing on the neighborhood difference degree of each pixel point in the third detection area, and extract third edge points based on the mean-processed neighborhood difference degrees.
Step S26 further extracts the edge points not detected in steps S21 and S24: in this step, the neighborhood difference degrees of the pixel points are mean-processed, and the third edge points are then extracted based on the mean-processed neighborhood difference degrees.
For the specific process, please refer to FIG. 8, which is a schematic flowchart of extracting third edge points in the embodiment shown in FIG. 2. The extraction of the third edge points includes the following steps.
S261: Set a mean window in the third detection area.
This step S261 is basically similar to the above step S241 and is not described again.
S262: Perform mean processing on the neighborhood difference degrees of the pixel points covered by the mean window at its current position, and use the result as the mean-processed neighborhood difference degree of the central pixel point among the covered pixel points.
After the mean window has been set in the third detection area, the neighborhood difference degrees of the pixel points it covers at the current position are mean-processed: the mean of the degrees of all pixel points within the mean window is calculated, and the calculated mean is taken as the processed neighborhood difference degree of the window's central pixel point.
S263: Move the third detection area and the mean window relative to each other, and return to step S262, until the mean window has traversed the third detection area.
Similarly to step S244, the third detection area and the mean window are moved relative to each other, returning to step S262, so that the neighborhood difference mean processing is performed on all pixel points in the third detection area. The third edge points are extracted according to the mean-processed neighborhood difference degrees, where pixel points with larger degrees are regarded as third edge points. Similarly to step S245 or step S246, the following two ways may also be used to extract the third edge points from the mean-processed neighborhood difference degrees.
S264: Compare the mean-processed neighborhood difference degrees with a preset difference threshold, and select the pixel points whose degree is greater than the difference threshold as the third edge points.
S265: Sort the mean-processed neighborhood difference degrees from large to small, and select the pixel points corresponding to the top preset number of degrees as the third edge points.
Likewise, steps S264 and S265 are not necessarily executed only after the neighborhood difference degrees of all pixel points in the third detection area have been mean-processed. Step S264 may be executed as soon as the mean processing of one pixel point's degree has been completed with a mean window: the mean-processed degree of that pixel point is compared with the preset difference threshold to determine whether it is an edge point. The third detection area and the mean window are then moved relative to each other to mean-process the next pixel point and determine whether it is an edge point.
Step S26 of extracting the third edge points can be understood with reference to FIG. 9, which is a schematic diagram of extracting third edge points from the image in the embodiment shown in FIG. 2.
Based on the third detection areas determined in step S25, and continuing the example of FIG. 7, step S25 determines three third detection areas: the region between pixel points a1 and e1, the region between pixel points e2 and b2, and the region between pixel points e4 and d1.
In step S26, third edge points are extracted within these three third detection areas: the third edge point f1 in the region between pixel points a1 and e1, the third edge point f2 in the region between e2 and b2, and the third edge points f3 and f4 in the region between e4 and d1.
S27: Determine the edge of the image according to the first edge points, the second edge points, and the third edge points.
After the first, second, and third edge points have been extracted through the above steps, the edge of the image can be determined from the edge points of the three extractions, for example by performing a fitting operation on all the edge points to determine the edge line in the image.
In this embodiment, the first extraction of edge points is performed according to the neighborhood difference degrees; the degrees are then amplified and the second extraction is performed according to the amplified degrees to find edge points with weaker differences; the degrees are averaged once more and the third extraction is performed according to the mean-processed degrees to find further edge points with weaker differences; the edge of the image is determined from the edge points of the three extractions, thereby realizing the detection of weak edges in the image.
Image edge detection can be applied to the detection of workpiece surfaces, so this application further proposes a method for detecting a workpiece surface. Please refer to FIG. 10, a schematic flowchart of an embodiment of the workpiece surface detection method of this application. This embodiment can realize the detection of workpiece surface quality, for example detecting whether there are scratches on the workpiece surface. The detection method specifically includes the following steps.
S31: Obtain a detection image of the workpiece surface.
In this step, the workpiece surface is photographed to obtain an image of the surface. The surface may also be photographed multiple times to obtain multiple images, which are then superimposed and averaged to obtain the detection image, thereby reducing the noise in the detection image.
S32: Perform edge detection on the detection image.
S33: Realize the detection of the workpiece surface according to the edge detection of the detection image.
The above edge detection method is used to perform edge detection on the detection image to determine the edges in it; the detection of the workpiece surface can then be realized according to the edge detection result.
This embodiment applies the image edge detection algorithm to the surface detection of workpieces and can detect weak scratches on the workpiece surface.
The above methods can all be applied through a detection device; their logical process is expressed by a computer program and is specifically implemented by an image processing device.
Please refer to FIG. 11, a schematic structural diagram of an embodiment of the image processing device of this application. The image processing device 100 of this embodiment includes a processor 11 and a memory 12; a computer program is stored in the memory 12, and the processor 11 is configured to execute the computer program to implement the above methods.
Please refer to FIG. 12, a schematic structural diagram of an embodiment of the computer storage medium of this application. The computer storage medium 200 of this embodiment stores a computer program that can be executed to implement the methods of the above embodiments; the computer storage medium 200 may be a USB flash drive, an optical disc, a server, or the like.
The above are only embodiments of this application and do not thereby limit its patent scope. Any equivalent structural or process transformation made using the contents of the specification and drawings of this application, or any direct or indirect application in other related technical fields, is likewise included within the patent protection scope of this application.

Claims (12)

  1. An image edge detection method, characterized in that the method comprises:
    obtaining the neighborhood difference degree of each pixel point in a first detection area of the image, and extracting first edge points based on the neighborhood difference degrees;
    determining a second detection area according to the first edge points, wherein the second detection area is smaller than the first detection area;
    amplifying the neighborhood difference degrees of the pixel points in the second detection area, and extracting second edge points based on the amplified neighborhood difference degrees;
    determining the edge of the image according to the first edge points and the second edge points.
  2. The method according to claim 1, characterized in that the step of determining a second detection area according to the first edge points comprises:
    classifying the first edge points according to their neighborhood difference degrees and/or positional relationships to obtain at least two first edge point blobs, wherein each first edge point blob includes at least one first edge point;
    determining the second detection area according to the first edge point blobs.
  3. The method according to claim 2, characterized in that the step of classifying the first edge points according to their neighborhood difference degrees and/or positional relationships comprises:
    comparing the neighborhood difference degrees of the first edge points with multiple preset difference threshold segments, and taking the first edge points whose neighborhood difference degrees fall within the same difference threshold segment as one class to form primary classification blobs;
    determining the first edge point blobs based on the primary classification blobs.
  4. The method according to claim 3, characterized in that the step of determining the first edge point blobs based on the primary classification blobs comprises:
    taking the primary classification blobs as the first edge point blobs; or
    further grouping into secondary classification blobs those primary classification blobs whose corresponding difference threshold segments differ by less than a preset inter-segment difference threshold and whose shortest distance to a neighboring primary classification blob is less than a preset distance threshold, and taking the secondary classification blobs as the first edge point blobs.
  5. The method according to claim 2, characterized in that the step of determining the second detection area according to the first edge point blobs comprises:
    determining the second detection area between adjacent first edge points within each first edge point blob;
    and/or determining the second detection area between the adjacent first edge points of neighboring first edge point blobs.
  6. The method according to claim 1, characterized in that the step of amplifying the neighborhood difference degrees of the pixel points in the second detection area and extracting second edge points based on the amplified neighborhood difference degrees comprises:
    setting a weighted window in the second detection area;
    weighting the pixel points covered by the weighted window at its current position, wherein the weight corresponding to the central pixel point among the covered pixel points is greater than the weights corresponding to the other pixel points;
    calculating the neighborhood difference degree between the weighted central pixel point and the other pixel points as the amplified neighborhood difference degree of the central pixel point;
    moving the second detection area and the weighted window relative to each other and returning to the step of weighting the pixel points covered by the weighted window at its current position, until the weighted window has traversed the second detection area.
  7. The method according to claim 6, characterized in that the farther an other pixel point is from the central pixel point, the smaller its corresponding weight.
  8. The method according to claim 1, characterized in that the step of amplifying the neighborhood difference degrees of the pixel points in the second detection area and extracting second edge points based on the amplified neighborhood difference degrees comprises:
    comparing the amplified neighborhood difference degrees with a preset difference threshold, and selecting the pixel points whose neighborhood difference degree is greater than the difference threshold as the second edge points;
    or sorting the amplified neighborhood difference degrees from large to small, and selecting the pixel points corresponding to the top preset number of neighborhood difference degrees as the second edge points.
  9. The method according to claim 1, characterized in that the step of determining the edge of the image according to the first edge points and the second edge points comprises:
    determining a third detection area according to the second edge points, the third detection area being smaller than the second detection area;
    performing mean processing on the neighborhood difference degree of each pixel point in the third detection area, and extracting third edge points based on the mean-processed neighborhood difference degrees;
    determining the edge of the image according to the first edge points, the second edge points, and the third edge points.
  10. The method according to claim 9, characterized in that the step of performing mean processing on the neighborhood difference degree of each pixel point in the third detection area and extracting third edge points based on the mean-processed neighborhood difference degrees comprises:
    setting a mean window in the third detection area;
    performing mean processing on the neighborhood difference degrees of the pixel points covered by the mean window at its current position, the result serving as the new neighborhood difference degree of the central pixel point among the covered pixel points;
    moving the third detection area and the mean window relative to each other and returning to the mean-processing step, until the mean window has traversed the third detection area.
  11. An image processing device, characterized in that the device comprises a processor and a memory, the memory storing a computer program, the processor being configured to execute the computer program to implement the method of any one of claims 1-10.
  12. A computer storage medium, characterized in that the computer storage medium is used for storing a computer program, the computer program being executable to implement the method of any one of claims 1-10.
PCT/CN2018/104892 2018-09-10 2018-09-10 图像的边缘检测方法、图像处理设备及计算机存储介质 WO2020051746A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/104892 WO2020051746A1 (zh) 2018-09-10 2018-09-10 图像的边缘检测方法、图像处理设备及计算机存储介质
CN201880087301.9A CN111630563B (zh) 2018-09-10 2018-09-10 图像的边缘检测方法、图像处理设备及计算机存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/104892 WO2020051746A1 (zh) 2018-09-10 2018-09-10 图像的边缘检测方法、图像处理设备及计算机存储介质

Publications (1)

Publication Number Publication Date
WO2020051746A1 true WO2020051746A1 (zh) 2020-03-19

Family

ID=69776940

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/104892 WO2020051746A1 (zh) 2018-09-10 2018-09-10 图像的边缘检测方法、图像处理设备及计算机存储介质

Country Status (2)

Country Link
CN (1) CN111630563B (zh)
WO (1) WO2020051746A1 (zh)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115294141A (zh) * 2022-10-10 2022-11-04 惠智赋能(滨州)信息科技服务有限公司 一种基于声纳图像的深海渔网检测方法
CN115564767A (zh) * 2022-11-10 2023-01-03 深圳市岑科实业有限公司 基于机器视觉的电感绕线质量监测方法
CN115908429A (zh) * 2023-03-08 2023-04-04 山东歆悦药业有限公司 一种泡脚药粉研磨精度检测方法及***
CN115984271A (zh) * 2023-03-20 2023-04-18 山东鑫科来信息技术有限公司 基于角点检测的金属毛刺识别方法
CN116168025A (zh) * 2023-04-24 2023-05-26 日照金果粮油有限公司 一种油幕式油炸花生生产***
CN116188462A (zh) * 2023-04-24 2023-05-30 深圳市翠绿贵金属材料科技有限公司 一种基于视觉鉴定的贵金属质量检测方法及***
CN116228772A (zh) * 2023-05-09 2023-06-06 聊城市检验检测中心 一种生鲜食品变质区域快速检测方法及***
CN116523901A (zh) * 2023-06-20 2023-08-01 东莞市京品精密模具有限公司 一种基于计算机视觉的冲切模具检测方法
CN116630312A (zh) * 2023-07-21 2023-08-22 山东鑫科来信息技术有限公司 一种恒力浮动打磨头打磨质量视觉检测方法
CN116682107A (zh) * 2023-08-03 2023-09-01 山东国宏生物科技有限公司 基于图像处理的大豆视觉检测方法
CN116703892A (zh) * 2023-08-01 2023-09-05 东莞市京品精密模具有限公司 一种基于图像数据的锂电池切刀磨损评估预警方法
CN116824516A (zh) * 2023-08-30 2023-09-29 中冶路桥建设有限公司 一种涉路施工安全监测及管理***
CN116883401A (zh) * 2023-09-07 2023-10-13 天津市生华厚德科技有限公司 一种工业产品生产质量检测***
CN116993628A (zh) * 2023-09-27 2023-11-03 四川大学华西医院 一种用于肿瘤射频消融引导的ct图像增强***
CN116993731A (zh) * 2023-09-27 2023-11-03 山东济矿鲁能煤电股份有限公司阳城煤矿 基于图像的盾构机刀头缺陷检测方法
CN117408988A (zh) * 2023-11-08 2024-01-16 北京维思陆科技有限公司 基于人工智能的病灶图像分析方法及装置
CN117474977A (zh) * 2023-12-27 2024-01-30 山东旭美尚诺装饰材料有限公司 基于机器视觉的欧松板凹坑快速检测方法及***
CN117723548A (zh) * 2023-12-14 2024-03-19 东莞市毅廷音响科技有限公司 一种汽车喇叭生产质量检测方法

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112782180A (zh) * 2020-12-23 2021-05-11 深圳市杰恩世智能科技有限公司 一种产品外观瑕疵、污点的检测方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9014265B1 (en) * 2011-12-29 2015-04-21 Google Inc. Video coding using edge detection and block partitioning for intra prediction
CN104700421A (zh) * 2015-03-27 2015-06-10 中国科学院光电技术研究所 一种基于canny的自适应阈值的边缘检测算法
CN104809800A (zh) * 2015-04-14 2015-07-29 深圳怡化电脑股份有限公司 提取纸币拼接痕迹的预处理方法、拼接钞识别方法及装置
CN107292897A (zh) * 2016-03-31 2017-10-24 展讯通信(天津)有限公司 用于yuv域的图像边缘提取方法、装置及终端

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115294141B (zh) * 2022-10-10 2023-03-10 惠智赋能(滨州)信息科技服务有限公司 一种基于声纳图像的深海渔网检测方法
CN115294141A (zh) * 2022-10-10 2022-11-04 惠智赋能(滨州)信息科技服务有限公司 一种基于声纳图像的深海渔网检测方法
CN115564767A (zh) * 2022-11-10 2023-01-03 深圳市岑科实业有限公司 基于机器视觉的电感绕线质量监测方法
CN115564767B (zh) * 2022-11-10 2023-04-07 深圳市岑科实业有限公司 基于机器视觉的电感绕线质量监测方法
CN115908429A (zh) * 2023-03-08 2023-04-04 山东歆悦药业有限公司 一种泡脚药粉研磨精度检测方法及***
CN115984271A (zh) * 2023-03-20 2023-04-18 山东鑫科来信息技术有限公司 基于角点检测的金属毛刺识别方法
CN116188462B (zh) * 2023-04-24 2023-08-11 深圳市翠绿贵金属材料科技有限公司 一种基于视觉鉴定的贵金属质量检测方法及***
CN116168025A (zh) * 2023-04-24 2023-05-26 日照金果粮油有限公司 一种油幕式油炸花生生产***
CN116188462A (zh) * 2023-04-24 2023-05-30 深圳市翠绿贵金属材料科技有限公司 一种基于视觉鉴定的贵金属质量检测方法及***
CN116228772A (zh) * 2023-05-09 2023-06-06 聊城市检验检测中心 一种生鲜食品变质区域快速检测方法及***
CN116523901B (zh) * 2023-06-20 2023-09-19 东莞市京品精密模具有限公司 一种基于计算机视觉的冲切模具检测方法
CN116523901A (zh) * 2023-06-20 2023-08-01 东莞市京品精密模具有限公司 一种基于计算机视觉的冲切模具检测方法
CN116630312A (zh) * 2023-07-21 2023-08-22 山东鑫科来信息技术有限公司 一种恒力浮动打磨头打磨质量视觉检测方法
CN116630312B (zh) * 2023-07-21 2023-09-26 山东鑫科来信息技术有限公司 一种恒力浮动打磨头打磨质量视觉检测方法
CN116703892A (zh) * 2023-08-01 2023-09-05 东莞市京品精密模具有限公司 一种基于图像数据的锂电池切刀磨损评估预警方法
CN116703892B (zh) * 2023-08-01 2023-11-14 东莞市京品精密模具有限公司 一种基于图像数据的锂电池切刀磨损评估预警方法
CN116682107A (zh) * 2023-08-03 2023-09-01 山东国宏生物科技有限公司 基于图像处理的大豆视觉检测方法
CN116682107B (zh) * 2023-08-03 2023-10-10 山东国宏生物科技有限公司 基于图像处理的大豆视觉检测方法
CN116824516A (zh) * 2023-08-30 2023-09-29 中冶路桥建设有限公司 一种涉路施工安全监测及管理***
CN116824516B (zh) * 2023-08-30 2023-11-21 中冶路桥建设有限公司 一种涉路施工安全监测及管理***
CN116883401B (zh) * 2023-09-07 2023-11-10 天津市生华厚德科技有限公司 一种工业产品生产质量检测***
CN116883401A (zh) * 2023-09-07 2023-10-13 天津市生华厚德科技有限公司 一种工业产品生产质量检测***
CN116993731A (zh) * 2023-09-27 2023-11-03 山东济矿鲁能煤电股份有限公司阳城煤矿 基于图像的盾构机刀头缺陷检测方法
CN116993628A (zh) * 2023-09-27 2023-11-03 四川大学华西医院 一种用于肿瘤射频消融引导的ct图像增强***
CN116993628B (zh) * 2023-09-27 2023-12-08 四川大学华西医院 一种用于肿瘤射频消融引导的ct图像增强***
CN116993731B (zh) * 2023-09-27 2023-12-19 山东济矿鲁能煤电股份有限公司阳城煤矿 基于图像的盾构机刀头缺陷检测方法
CN117408988A (zh) * 2023-11-08 2024-01-16 北京维思陆科技有限公司 基于人工智能的病灶图像分析方法及装置
CN117408988B (zh) * 2023-11-08 2024-05-14 北京维思陆科技有限公司 基于人工智能的病灶图像分析方法及装置
CN117723548A (zh) * 2023-12-14 2024-03-19 东莞市毅廷音响科技有限公司 一种汽车喇叭生产质量检测方法
CN117474977A (zh) * 2023-12-27 2024-01-30 山东旭美尚诺装饰材料有限公司 基于机器视觉的欧松板凹坑快速检测方法及***
CN117474977B (zh) * 2023-12-27 2024-03-22 山东旭美尚诺装饰材料有限公司 基于机器视觉的欧松板凹坑快速检测方法及***

Also Published As

Publication number Publication date
CN111630563A (zh) 2020-09-04
CN111630563B (zh) 2022-02-18

Similar Documents

Publication Publication Date Title
WO2020051746A1 (zh) 图像的边缘检测方法、图像处理设备及计算机存储介质
US8902053B2 (en) Method and system for lane departure warning
CN110009638B (zh) 基于局部统计特征的桥梁拉索图像外观缺陷检测方法
JP6654849B2 (ja) コンクリートの表面ひび割れの検出方法
CN113109368B (zh) 玻璃裂纹检测方法、装置、设备及介质
KR101609303B1 (ko) 카메라 캘리브레이션 방법 및 그 장치
CN110610150B (zh) 一种目标运动物体的跟踪方法、装置、计算设备和介质
TWI479431B (zh) 物件追蹤方法
JP6679858B2 (ja) 対象の遮蔽を検出する方法と装置
KR102085035B1 (ko) 객체 인식을 위한 객체 후보영역 설정방법 및 장치
CN111060014B (zh) 一种基于机器视觉的在线自适应烟丝宽度测量方法
CN109447011B (zh) 红外对蒸汽管道泄露的实时监控方法
Heydari et al. An industrial image processing-based approach for estimation of iron ore green pellet size distribution
JP6572411B2 (ja) レール検出装置
CN117557571B (zh) 基于图像增强的合金电阻焊接缺陷视觉检测方法及***
KR102195940B1 (ko) 적응적 비최대억제 방법을 이용하는 딥러닝기반 영상객체 탐지를 위한 장치 및 방법
CN106530273B (zh) 高精度fpc直线线路检测与缺陷定位方法
CN111630565B (zh) 图像处理方法、边缘提取方法、处理设备及存储介质
JP6199799B2 (ja) 自発光材料画像処理装置及び自発光材料画像処理方法
KR102133330B1 (ko) 구조물의 균열 인식 장치 및 방법
CN109558881B (zh) 一种基于计算机视觉的危岩崩塌监控方法
JP4560434B2 (ja) 変化領域抽出方法およびこの方法のプログラム
KR102292602B1 (ko) 딥러닝 및 이미지 프로세싱 기반 볼트 풀림 검출 방법
JP2020076607A (ja) 鋼材成分識別装置、鋼材成分識別方法、及び鋼材成分識別プログラム
Tang et al. Robust vehicle edge detection by cross filter method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18933301

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18933301

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 13.08.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18933301

Country of ref document: EP

Kind code of ref document: A1