CN112991253B - Central area determining method, foreign matter removing device and detecting equipment - Google Patents

Central area determining method, foreign matter removing device and detecting equipment

Info

Publication number
CN112991253B
CN112991253B
Authority
CN
China
Prior art keywords
pixel
pixels
area
layer
foreign object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911215303.4A
Other languages
Chinese (zh)
Other versions
CN112991253A (en)
Inventor
刘松
吴明
Current Assignee
Hefei Meyer Optoelectronic Technology Inc
Original Assignee
Hefei Meyer Optoelectronic Technology Inc
Priority date
Filing date
Publication date
Application filed by Hefei Meyer Optoelectronic Technology Inc filed Critical Hefei Meyer Optoelectronic Technology Inc
Priority to CN201911215303.4A priority Critical patent/CN112991253B/en
Publication of CN112991253A publication Critical patent/CN112991253A/en
Application granted granted Critical
Publication of CN112991253B publication Critical patent/CN112991253B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/12 Edge-based segmentation
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/40 Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method for determining the central area of a target, a foreign matter removal method and device, and foreign matter detection equipment. The method for determining the central area comprises the following steps: acquiring an image of a target area; deleting pixels from the target area layer by layer from outside to inside, so that the target area keeps shrinking while the number of connected domains in it remains unchanged, until a preset number of pixels remain; and determining the remaining preset number of pixels as the central area of the target. Compared with directly taking the centroid of the target area as the central area, the central area calculated in this way cannot fall outside the target area, so the accuracy is higher.

Description

Central area determining method, foreign matter removing device and detecting equipment
Technical Field
The invention relates to the field of computer technology, and in particular to a method for determining the central area of a target, a method and a device for removing foreign matter, and foreign matter detection equipment.
Background
Large quantities of agricultural products such as wheat and corn are stored in China every year, but various impurities such as stones and hair are introduced during the drying and storage of the grain, which lowers grain quality and also creates food-safety risks. It is therefore necessary to detect and identify foreign matter in the grain and remove it. During removal, targeting the central area of the impurity is an effective approach, and the centroid of the impurity is usually taken as that central area.
In implementing the present invention, the inventors found at least the following problem in the related art: during removal, if the foreign object is elongated, such as a hair, its centroid may not lie on the object itself; that is, the accuracy of the determined central area is low.
Disclosure of Invention
The invention aims to provide a method for determining the central area of a target, a method and a device for determining the removal point of a foreign object and removing it, and foreign matter detection equipment, so as to improve the accuracy of determining the central area.
In a first aspect, an embodiment of the present invention provides a method for determining a central area of a target, where the method includes:
acquiring an image of a target area;
deleting pixels from the target area layer by layer from outside to inside, so that the target area keeps shrinking while the number of connected domains in it remains unchanged, until a preset number of pixels remain;
The remaining preset number of pixels is determined as the central area of the target.
According to the invention, a layer of pixels of fixed width is deleted in each round: the outermost layer is deleted first, then the next layer inward, and so on, until only pixels of a certain width, i.e., the preset number of pixels, remain; the remaining pixels form the required central area. The basic principle is to simulate burning grass: each round burns away a single-pixel-wide layer, starting from the outermost layer and moving one layer inward at a time, and the burning stops when only the preset number of pixels remains, these remaining pixels being the required central area. Compared with directly taking the centroid of the target area as the central area, the central area calculated in this way cannot fall outside the target area, so the accuracy is higher.
Optionally, the step of acquiring the target area image includes:
Identifying a target area from the original image;
Filling the target area with a first preset color and the other areas with a second preset color to obtain the image of the target area, where one of the first and second preset colors is black and the other is white.
Optionally, the step of deleting pixels from the target area layer by layer from outside to inside includes:
for each pixel in the current remaining area corresponding to the target area, judging whether the pixel is an outer pixel to be deleted at the current layer according to its gray value and the gray values of its adjacent pixels; traversing the current remaining area and deleting the outer pixels to be deleted; and repeating these steps until the number of pixels in the current remaining area equals the preset number.
Optionally, for each pixel in the current remaining area corresponding to the target area, judging whether the pixel is an outer pixel to be deleted at the current layer according to its gray value and the gray values of its adjacent pixels includes:
judging whether the gray-value distribution in the 3×3 neighborhood of the pixel matches any of a plurality of preset 3×3-neighborhood gray-value distribution templates for pixels to be deleted, and if so, determining the pixel to be an outer pixel to be deleted at the current layer.
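As a small illustration of this template test, the sketch below matches a pixel's 3×3 gray-value neighborhood against a list of templates. The single template shown is hypothetical (the patent's actual templates are those of Tables 1 and 2 in the description), and -1 is used here as a "don't care" marker:

```python
import numpy as np

# A hypothetical 3x3 template for an outer pixel to be deleted. 0 marks the
# target gray value, 255 the background, and -1 a "don't care" position.
# The patent's real templates are those of Tables 1 and 2.
TEMPLATES = [
    np.array([[255, 255, 255],
              [ -1,   0,  -1],
              [  0,   0,   0]]),
]

def matches_any_template(neigh, templates=TEMPLATES):
    """Return True if the 3x3 gray-value neighborhood `neigh` matches any
    template, ignoring the -1 positions."""
    for t in templates:
        care = t != -1                       # positions that must match
        if np.array_equal(neigh[care], t[care]):
            return True
    return False

# Usage: a pixel with background above and target pixels below matches.
neigh = np.array([[255, 255, 255],
                  [  0,   0, 255],
                  [  0,   0,   0]])
hit = matches_any_template(neigh)            # -> True
```

In a full implementation this check would be evaluated for every pixel of the current remaining area, once per template table, before the marked pixels are deleted.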
In a second aspect, an embodiment of the present invention provides a foreign object removal method, including:
Acquiring an image of a foreign object region;
deleting pixels from the foreign object area layer by layer from outside to inside, so that the foreign object area keeps shrinking while the number of connected domains in it remains unchanged, until a preset number of pixels remain;
determining at least one pixel of the remaining preset number of pixels as a removal point;
and removing the foreign matter according to the determined removal point.
Compared with directly taking the centroid of the target area as the central area, the central area obtained in this way cannot fall outside the target area, so the accuracy is higher. Determining one of the remaining preset number of pixels as the removal point of the foreign object ensures that the removal point lies in the central area of the foreign object without falling off it; the foreign object is then blown away or gripped at the removal point, so the removal efficiency is high.
Optionally, the step of acquiring the foreign object region image includes:
acquiring an image to be detected of the material, and identifying the foreign object area from the image to be detected;
filling the foreign object area with a first preset color and the other areas with a second preset color to obtain the image of the foreign object area, where one of the first and second preset colors is black and the other is white.
Optionally, the step of acquiring the image to be detected and identifying the foreign object area according to the image to be detected includes:
acquiring the image to be detected, and inputting it into a trained neural-network-based recognition model to identify the foreign object area.
Optionally, the step of deleting pixels from the foreign object region layer by layer from outside to inside includes:
for each pixel in the current remaining area corresponding to the foreign object area, judging whether the pixel is an outer pixel to be deleted at the current layer according to its gray value and the gray values of its adjacent pixels; traversing the current remaining area and deleting the outer pixels to be deleted; and repeating these steps until the number of pixels in the current remaining area equals the preset number.
Optionally, the step of deleting pixels from the foreign object region layer by layer from outside to inside includes:
calculating Hu moment of the foreign object region;
Judging whether the Hu moment is smaller than a preset threshold value, and if so, deleting pixels from outside to inside layer by layer for the foreign object area.
Optionally, the method further comprises:
If the Hu moment is not smaller than the preset threshold, calculating the centroid of the foreign object region and determining the calculated centroid as the removal point of the foreign object.
Optionally, the step of removing the foreign matter according to the determined removal point includes:
blowing air at the removal point with a nozzle, or gripping the foreign object at the removal point with a robot, to remove it. Determining at least one of the remaining preset number of pixels as the removal point ensures that the removal point lies in the central area of the foreign object without falling off it, so the removal efficiency is high.
In a third aspect, an embodiment of the present invention provides a foreign matter removal apparatus, including:
The acquisition module is used for acquiring an image of the foreign object area;
The pixel deleting module is used for deleting pixels from the foreign object area layer by layer from outside to inside, so that the foreign object area keeps shrinking while the number of connected domains in it remains unchanged, until a preset number of pixels remain;
The removal point determining module is used for determining at least one pixel of the remaining preset number of pixels as a removal point;
And the removing module is used for removing the foreign matter according to the determined removal point.
In a fourth aspect, an embodiment of the present invention provides a foreign matter detection apparatus that eliminates detected foreign matter according to any one of the above-described foreign matter elimination methods.
Optionally, the detection device is a color sorter or an X-ray foreign matter detector for detecting tea or grains.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
fig. 1 is a flow chart of a method for determining a central area of a target according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of position numbers in an outer pixel 3×3 neighborhood gray value distribution template according to an embodiment of the present invention;
FIG. 3 is a graph of a 3×3 neighborhood gray value distribution corresponding to distribution 1 in Table 1;
Fig. 4 is a central area (specifically, a center point) of hair determined according to the related art;
FIG. 5 is a central area (specifically, a center point) of hair determined in accordance with an embodiment of the present invention;
fig. 6 is a central area (specifically, a central point) of a stone determined according to the related art;
FIG. 7 is a central region (specifically, a center point) of a stone, as determined in accordance with an embodiment of the present invention;
FIG. 8 is a flow chart of a method for removing foreign matters according to an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of a foreign object removing apparatus according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present invention and should not be construed as limiting the invention.
In the related art there is the technical problem that the accuracy of determining the central area is low; to solve this problem, the embodiments of the invention provide a method for determining the central area of a target, a foreign matter removal method and device, and foreign matter detection equipment. The present invention is described in detail below.
The embodiment of the invention provides a method for determining a central area of a target, as shown in fig. 1, the method comprises the following steps:
s101, acquiring an image of a target area. The image of the target area may be an image taken from the original image by using a circumscribed rectangle of the target area.
The step of acquiring the target area image includes:
(1) Identifying the target area from the original image. The original image may be a color image or an X-ray image; the target area can be identified by a trained neural-network-based recognition model, or extracted by a traditional machine learning algorithm.
(2) Filling the target area with a first preset color and the remaining areas with a second preset color to obtain the image of the target area, where one of the colors is black and the other white; that is, after filling, the target area is black and the outside is white, or the target area is white and the outside is black. In practice this can be obtained by binarizing the image of the identified target area: 255 inside the target area and 0 outside, or 0 inside and 255 outside.
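A minimal NumPy sketch of this filling/binarization step follows. It assumes the target area is already given as a boolean mask (the mask itself would come from the recognition model, which is not shown), and it picks the polarity target = 0, background = 255; the patent allows either polarity:

```python
import numpy as np

def fill_target_image(mask: np.ndarray) -> np.ndarray:
    """Build the binary target-area image of step (2).

    mask : boolean array, True where the recognition model marked the target.
    Returns a uint8 image with 0 inside the target and 255 outside.
    """
    img = np.full(mask.shape, 255, dtype=np.uint8)  # background -> 255
    img[mask] = 0                                   # target area -> 0
    return img

# Usage with a hypothetical 4x4 mask containing a 2x2 target.
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
img = fill_target_image(mask)
```

The opposite polarity is obtained by swapping the 0 and 255 constants, exactly as the text describes.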
S102, deleting pixels from the target area layer by layer from outside to inside, so that the target area keeps shrinking while the number of connected domains in it remains unchanged, until a preset number of pixels remain.
The target area keeps shrinking as pixels are deleted round by round, while the number of connected domains stays unchanged. Specifically, for each pixel in the current remaining area corresponding to the target area, whether the pixel is an outer pixel to be deleted at the current layer is judged according to its gray value and the gray values of its adjacent pixels, which keeps the number of connected domains unchanged during deletion; the current remaining area is traversed and the outer pixels to be deleted are removed; and these steps are repeated until the number of pixels in the current remaining area equals the preset number, thereby deleting pixels layer by layer from outside to inside.
In a specific implementation, the pixels of the target area are 0 and the remaining pixels are 255, and gray-value distribution templates for the 3×3 neighborhood of an outer pixel are defined as shown in Table 1 and Table 2; the position numbering is shown in Fig. 2, and Fig. 3 shows the 3×3-neighborhood gray-value distribution corresponding to distribution 1 in Table 1. Of course, in other embodiments the roles of 0 and 255 can be exchanged: if the pixels of the target area are 255 and the remaining pixels are 0, then in every distribution of Tables 1 and 2 the positions marked 0 become 255 and the remaining positions become 0.
First, the pixels of the current remaining area are traversed row by row from top to bottom and from left to right using the templates of Table 1: the gray-value distribution in the 3×3 neighborhood of the current pixel is matched against those templates, and if it matches one of them, the current pixel is determined to be an outer pixel to be deleted; after the whole current remaining area has been traversed with the templates of Table 1, the pixels so determined are deleted. Then the pixels of the current remaining area are traversed in the same order using the templates of Table 2, again marking and then deleting the matching outer pixels. Repeating these two steps deletes pixels layer by layer from outside to inside.
The end condition of the loop that deletes outer-layer pixels may be: judging whether the number of pixels in the 5×5 neighborhood of the current pixel is less than or equal to 2, i.e., the preset number is 1 or 2, and if so, stopping the deletion.
TABLE 1. Gray-value distribution templates, part 1, for the 3×3 neighborhood of outer pixels (blank positions are 255)
TABLE 2. Gray-value distribution templates, part 2, for the 3×3 neighborhood of outer pixels (blank positions are 255)
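The layer-by-layer deletion of S102 can be sketched as follows. The patent's template tables are not reproduced here; as a stand-in, this sketch deletes a border pixel only when its crossing number is 1, a generic "simple point" test that likewise keeps the number of connected domains unchanged, and it peels one layer per pass so the region shrinks from all sides. The polarity is inverted relative to the tables (1 = target, 0 = background) purely for readability:

```python
import numpy as np

def neighbours(img, y, x):
    # clockwise 3x3 neighbours starting from the pixel directly above
    return [img[y-1, x], img[y-1, x+1], img[y, x+1], img[y+1, x+1],
            img[y+1, x], img[y+1, x-1], img[y, x-1], img[y-1, x-1]]

def crossing_number(P):
    # number of 0 -> 1 transitions around the neighbour ring; a value of 1
    # means the pixel can be removed without changing the number of
    # connected domains in its neighbourhood
    return sum(P[i] == 0 and P[(i + 1) % 8] == 1 for i in range(8))

def burn_to_centre(mask, preset=2):
    """'Burn' border pixels layer by layer until at most `preset` pixels
    remain. `mask` uses 1 = target, 0 = background, and must have a
    one-pixel background margin around its border."""
    img = mask.astype(np.uint8).copy()
    while img.sum() > preset:
        # mark this layer's candidates on a frozen copy, so the layer is
        # peeled from all sides of the region rather than from one end
        frozen = img.copy()
        candidates = [(y, x) for y, x in zip(*np.nonzero(frozen))
                      if sum(neighbours(frozen, y, x)) < 8
                      and crossing_number(neighbours(frozen, y, x)) == 1]
        if not candidates:
            break
        for y, x in candidates:
            if img.sum() <= preset:
                break
            # re-check on the live image so connectivity is never broken
            if crossing_number(neighbours(img, y, x)) == 1:
                img[y, x] = 0
    return img

# Usage: a 3x3 block burns down to a single central-area pixel.
blk = np.zeros((7, 7), dtype=np.uint8)
blk[2:5, 2:5] = 1
out = burn_to_centre(blk, preset=1)
```

Because removal within a layer is sequential, the exact surviving pixel can differ from the patent's template-based result, but it always lies on the target region.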
S103, determining the remaining preset number of pixels as the central area of the target. If the preset number is 1, that single pixel can be regarded as the center point of the target area.
According to the invention, a layer of pixels of fixed width is deleted in each round: the outermost layer is deleted first, then the next layer inward, and so on, until only pixels of a certain width, i.e., the preset number of pixels, remain; the remaining pixels form the required central area. The basic principle is to simulate burning grass: each round burns away a single-pixel-wide layer, starting from the outermost layer and moving one layer inward at a time, and the burning stops when only the preset number of pixels remains, these remaining pixels being the required central area. Compared with directly taking the centroid of the target area as the central area, the central area calculated in this way cannot fall outside the target area, so the accuracy is higher.
As shown in Figs. 4 and 5, Fig. 4 shows the centroid position of a hair, and Fig. 5 shows the center point of the same impurity hair determined by the central-area determining method of the embodiment of the invention; it can be seen that the centroid of the hair is not on the hair, whereas the center point determined by the invention is. As shown in Figs. 6 and 7, Fig. 6 shows the centroid position of a stone, and Fig. 7 shows the center point of the same impurity stone determined by the method; here the centroid and the determined center point almost coincide. It follows that, compared with directly taking the centroid of the target area as the central area, the calculated central area does not fall outside the target area, and the accuracy is higher.
Based on the same inventive concept as the above-described target center region determination method, an embodiment of the present invention provides a foreign object removal method, as shown in fig. 8, including:
S201, acquiring an image of a foreign object area.
The step of acquiring the foreign matter region image includes:
(1) Acquiring an image to be detected of the material and identifying the foreign object area from it; specifically, the image to be detected is acquired and input into a trained neural-network-based recognition model to identify the foreign object area.
For example, foreign matter detection is performed on grains, and impurities such as hair, stones and the like exist in the grains to be detected.
Training phase: a sufficient number of sample images is collected for each kind of impurity for deep-learning training. Because elongated impurities such as hair are small, a high-definition camera is needed to obtain the sample images; a 12-megapixel industrial camera was used when implementing this example.
Each pixel of the impurity is labeled manually. Because of the high image resolution, the image needs to be divided into equal parts, for example an 11×11 grid of tiles, with overlap between neighboring tiles. The tiles are then used for training with a common semantic-segmentation algorithm such as FCN (Fully Convolutional Network), U-Net, or DeepLab, where U-Net is an improvement and extension of the FCN.
Prediction phase: first, the image to be detected is acquired and divided into tiles in the same way as in the training phase; then deep-learning prediction is performed on each tile; finally, the per-tile predictions are mapped back onto the original image to obtain the final recognition result.
(2) Filling the foreign object area with a first preset color and the remaining areas with a second preset color to obtain the image of the foreign object area, where one of the colors is black and the other white; that is, after filling, the foreign object area is black and the outside is white, or the foreign object area is white and the outside is black. In practice this can be obtained by binarizing the image of the identified foreign object area: 255 inside the area and 0 outside, or 0 inside and 255 outside.
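The tile-and-stitch prediction of the prediction phase above can be sketched as follows. The grid size, overlap, and the `predict_tile` stand-in (here a simple threshold instead of the trained U-Net) are assumptions for illustration, and the image size is assumed divisible by the grid:

```python
import numpy as np

def predict_full_image(image, predict_tile, grid=11, overlap=8):
    """Split `image` into a grid x grid set of overlapping tiles, run
    `predict_tile` on each, and stitch the per-pixel masks back onto the
    full image (overlapping predictions are combined with a logical OR).
    Assumes image height and width are divisible by `grid`."""
    H, W = image.shape[:2]
    th, tw = H // grid, W // grid
    result = np.zeros((H, W), dtype=bool)
    for i in range(grid):
        for j in range(grid):
            # tile bounds, expanded by `overlap` pixels on each side
            y0, y1 = max(i * th - overlap, 0), min((i + 1) * th + overlap, H)
            x0, x1 = max(j * tw - overlap, 0), min((j + 1) * tw + overlap, W)
            result[y0:y1, x0:x1] |= predict_tile(image[y0:y1, x0:x1])
    return result

# Usage with a stand-in 'model' that thresholds the tile.
img = np.zeros((22, 22)); img[5, 5] = 1.0
pred = predict_full_image(img, lambda t: t > 0.5, grid=11, overlap=2)
```

In the real pipeline `predict_tile` would wrap the trained segmentation network, and the overlap matches the overlap used when tiling the training images.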
S202, deleting pixels from the foreign object area layer by layer from outside to inside, so that the foreign object area keeps shrinking while the number of connected domains in it remains unchanged, until a preset number of pixels remain.
For each pixel in the current remaining area corresponding to the foreign object area, whether the pixel is an outer pixel to be deleted at the current layer is judged according to its gray value and the gray values of its adjacent pixels; the current remaining area is traversed and the outer pixels to be deleted are removed; and these steps are repeated until the number of pixels in the current remaining area equals the preset number. In this step the foreign object area plays the role of the target area; otherwise the step is the same as S102, so its explanation can be found in the corresponding parts above and is not repeated here.
S203, determining at least one pixel of the remaining preset number of pixels as the removal point of the foreign object.
The remaining preset number of pixels belong to the central area of the foreign object region, so taking any one of them as the removal point ensures accurate removal of the foreign object.
S204, removing the foreign object according to the determined removal point.
According to the invention, a layer of pixels of fixed width is deleted in each round: the outermost layer is deleted first, then the next layer inward, and so on, until only pixels of a certain width, i.e., the preset number of pixels, remain; the remaining pixels form the required central area. The basic principle is to simulate burning grass: each round burns away a single-pixel-wide layer, starting from the outermost layer and moving one layer inward at a time, and the burning stops when only the preset number of pixels remains, these remaining pixels being the required central area. Compared with directly taking the centroid of the target area as the central area, the central area calculated in this way cannot fall outside the target area, so the accuracy is higher. Determining one of the remaining preset number of pixels as the removal point of the foreign object ensures that the removal point lies in the central area of the foreign object without falling off it, so the foreign object can be removed accurately.
Further, the step of deleting pixels layer by layer from outside to inside for the foreign object region includes:
(1) Calculating the Hu moments of the foreign object region; computing Hu moments is a known technique and is not described in detail here.
(2) Judging whether the Hu moment is smaller than a preset threshold, and if so, deleting the outer pixels of the foreign object region layer by layer from outside to inside.
Determining the removal point of an elongated impurity by deleting outer pixels layer by layer avoids choosing a point that lies off the impurity, which guarantees removal accuracy.
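As an illustration of this Hu-moment gate, the sketch below computes the first Hu invariant h1 = eta20 + eta02 of a binary region and uses it to choose a strategy. The patent does not disclose which Hu value or threshold it uses; h1, the threshold value, and the comparison direction here are assumptions, chosen so that elongated regions (which have a large h1) are routed to layer-by-layer deletion and compact regions to the centroid:

```python
import numpy as np

def hu_first_invariant(mask):
    """First Hu moment h1 = eta20 + eta02 of a binary region `mask`."""
    ys, xs = np.nonzero(mask)
    m00 = len(xs)                        # zeroth moment = pixel count
    xc, yc = xs.mean(), ys.mean()
    mu20 = ((xs - xc) ** 2).sum()        # second-order central moments
    mu02 = ((ys - yc) ** 2).sum()
    # normalised central moments: eta_pq = mu_pq / m00^((p+q)/2 + 1)
    return (mu20 + mu02) / m00 ** 2

def choose_strategy(mask, thresh=0.3):
    """Route slender regions to layer-by-layer deletion ('burn') and
    compact regions to the centroid. `thresh` is an illustrative value."""
    return "burn" if hu_first_invariant(mask) > thresh else "centroid"

# A compact 3x3 blob versus a thin 1x9 line.
blob = np.zeros((5, 5), dtype=bool); blob[1:4, 1:4] = True
line = np.zeros((3, 11), dtype=bool); line[1, 1:10] = True
```

With these assumptions the blob is handled by the centroid branch and the line by the burning branch, matching the hair/stone example of Figs. 4 to 7.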
Still further, the method further comprises:
If the Hu moment is not smaller than the preset threshold, the centroid of the foreign object region is calculated and determined as the removal point of the foreign object. The preset threshold screens out the relatively slender foreign object regions; the remaining regions are roughly round or square, and for such regions the calculated centroid generally falls on the region itself, so the centroid can be used directly as the removal point for rejecting the impurity.
Specifically, the centroid of the foreign object region can be calculated according to the following formula:

$$\bar{x} = \frac{1}{N}\sum_{(x,y)\in R} x, \qquad \bar{y} = \frac{1}{N}\sum_{(x,y)\in R} y$$

where $(\bar{x}, \bar{y})$ are the centroid coordinates, $(x, y)$ are the coordinates of a pixel of the foreign object region $R$, and $N$ is the number of pixels of the foreign object region.
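A direct NumPy transcription of this centroid calculation is given below as a sketch; the region is assumed to be supplied as a boolean mask:

```python
import numpy as np

def region_centroid(mask):
    """Centroid (x_bar, y_bar) of a binary region: the mean of the pixel
    coordinates over the N region pixels."""
    ys, xs = np.nonzero(mask)
    n = len(xs)                        # N: number of region pixels
    return xs.sum() / n, ys.sum() / n

# Usage on a hypothetical 3x3 square region.
m = np.zeros((5, 5), dtype=bool)
m[1:4, 1:4] = True
cx, cy = region_centroid(m)            # -> (2.0, 2.0)
```

For a roughly round or square region such as the stone of Fig. 6, this point falls on the region itself and can serve directly as the removal point.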
As shown in Figs. 4 to 7, the hair is an example of an elongated object, while the stone is an example of a roughly round or square object. For the former kind of object, determining the removal point by deleting pixels layer by layer avoids choosing a point that lies off the object, and the determined removal point lies in the object's central area. For the latter kind, the calculated centroid also lies on the object, so the centroid can be used directly as the removal point. By setting the preset threshold, the share of regions handled by each of the two determination modes can be controlled, which reduces the overall amount of computation and improves the efficiency and accuracy of removal-point determination.
In the embodiment of the invention, a nozzle blows air at the removal point to remove the foreign object, or a robot grips the foreign object at the removal point to remove it.
The embodiment of the invention also provides a foreign matter removal apparatus, as shown in Fig. 9, which comprises:
An acquisition module 91 for acquiring an image of the foreign object region.
The pixel deleting module 92 is used for deleting pixels from the foreign object area layer by layer from outside to inside, so that the foreign object area keeps shrinking while the number of connected domains in it remains unchanged, until a preset number of pixels remain.
The removal point determining module 93 is used for determining at least one pixel of the remaining preset number of pixels as a removal point.
The removing module 94 is used for removing the foreign matter according to the determined removal point.
The embodiment of the invention further provides foreign matter detection equipment that removes detected foreign matter according to any of the foreign matter removal methods described above. In one embodiment, the detection equipment is a color sorter or an X-ray foreign matter detector for tea or grain detection. The foreign matter detection equipment has at least the same advantageous effects as the foreign matter removal method described above; refer to the corresponding parts, which are not repeated here.
In the description of the present specification, reference to the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine the different embodiments or examples described in this specification, and the features of those embodiments or examples, provided there is no contradiction between them.
Furthermore, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "plurality" means at least two, for example two or three, unless specifically defined otherwise.

Claims (12)

1. A method of determining a central region of an object, the method comprising:
acquiring an image of a target area;
deleting pixels from the target area layer by layer from outside to inside, so that the target area shrinks continuously while the number of connected domains in the target area remains unchanged, until a preset number of pixels remain;
determining the remaining preset number of pixels as the central area of the target;
The step of deleting pixels from the target area layer by layer from outside to inside comprises the following steps:
for each pixel in the current remaining area corresponding to the target area, judging, according to the gray value of the pixel and the gray values of its adjacent pixels, whether the pixel is an external pixel to be deleted at the current layer, while ensuring that the number of connected domains remains unchanged during the deletion process; traversing the current remaining area and deleting the external pixels to be deleted; and repeating the above steps until the number of pixels in the current remaining area equals the preset number.
2. The method of claim 1, wherein the step of acquiring the image of the target area comprises:
Identifying a target area from the original image;
Filling a first preset color in a target area, and filling a second preset color in other areas to obtain an image of the target area, wherein one of the first preset color and the second preset color is black, and the other one is white.
3. The method according to claim 1 or 2, wherein, for each pixel in the current remaining area corresponding to the target area, judging whether the pixel is an external pixel to be deleted at the current layer according to the gray value of the pixel and the gray values of its adjacent pixels comprises: judging whether the 3×3-neighborhood gray value distribution of the pixel matches any one of a plurality of preset 3×3-neighborhood gray value distribution templates of pixels to be deleted, and if so, determining that the pixel is an external pixel to be deleted at the current layer.
4. A foreign matter removal method, characterized by comprising:
Acquiring an image of a foreign object region;
deleting pixels from the foreign object area layer by layer from outside to inside, so that the foreign object area shrinks continuously while the number of connected domains in the foreign object area remains unchanged, until a preset number of pixels remain;
determining at least one pixel of the remaining preset number of pixels as a reject point;
removing the foreign matter according to the determined reject point;
The step of deleting pixels from the foreign object area layer by layer from outside to inside comprises: for each pixel in the current remaining area corresponding to the foreign object area, judging, according to the gray value of the pixel and the gray values of its adjacent pixels, whether the pixel is an external pixel to be deleted at the current layer, while ensuring that the number of connected domains remains unchanged during the deletion process; traversing the current remaining area and deleting the external pixels to be deleted; and repeating the above steps until the number of pixels in the current remaining area equals the preset number.
5. The method of claim 4, wherein the step of acquiring the foreign object region image comprises:
acquiring an image to be detected of a material, and identifying a foreign matter area according to the image to be detected;
Filling a first preset color into the foreign object region, and filling a second preset color into the other region to obtain an image of the foreign object region, wherein one of the first preset color and the second preset color is black, and the other is white.
6. The method of claim 5, wherein the step of acquiring an image to be detected and identifying a foreign object region from the image to be detected comprises:
acquiring the image to be detected, and inputting the image to be detected into a trained neural network-based recognition model to identify the foreign object region.
7. The method of claim 4, wherein the step of deleting pixels layer by layer from outside to inside for the foreign object region comprises: calculating a Hu moment of the foreign object region; judging whether the Hu moment is smaller than a preset threshold; and if so, deleting pixels from the foreign object region layer by layer from outside to inside.
8. The method of claim 7, wherein the method further comprises:
and if the Hu moment is not smaller than the preset threshold, calculating the mass center of the foreign object region, and determining the calculated mass center as the reject point of the foreign object.
9. The method of claim 5, wherein the step of removing the foreign matter according to the determined reject point comprises: blowing air at the reject point with a nozzle to remove the foreign matter, or gripping the foreign matter at the reject point with a robot to remove it.
10. A foreign matter removal device, characterized in that the device comprises:
The acquisition module is used for acquiring an image of the foreign object area;
a pixel deleting module, configured to delete pixels from the foreign object area layer by layer from outside to inside, so that the foreign object area shrinks continuously while the number of connected domains in the foreign object area remains unchanged, until a preset number of pixels remain;
a reject point determining module, configured to determine at least one pixel of the remaining preset number of pixels as a reject point;
a rejecting module, configured to remove the foreign matter according to the determined reject point;
The step of deleting pixels from the foreign object area layer by layer from outside to inside comprises: for each pixel in the current remaining area corresponding to the foreign object area, judging, according to the gray value of the pixel and the gray values of its adjacent pixels, whether the pixel is an external pixel to be deleted at the current layer, while ensuring that the number of connected domains remains unchanged during the deletion process; traversing the current remaining area and deleting the external pixels to be deleted; and repeating the above steps until the number of pixels in the current remaining area equals the preset number.
11. A foreign matter detection apparatus, characterized in that the foreign matter detection apparatus removes detected foreign matter according to the foreign matter removal method of any one of claims 4 to 9.
12. The foreign matter detection apparatus of claim 11, wherein the detection apparatus is a color sorter or an X-ray foreign matter detector for tea or grain detection.
CN201911215303.4A 2019-12-02 2019-12-02 Central area determining method, foreign matter removing device and detecting equipment Active CN112991253B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911215303.4A CN112991253B (en) 2019-12-02 2019-12-02 Central area determining method, foreign matter removing device and detecting equipment


Publications (2)

Publication Number Publication Date
CN112991253A CN112991253A (en) 2021-06-18
CN112991253B true CN112991253B (en) 2024-05-31

Family

ID=76331551

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911215303.4A Active CN112991253B (en) 2019-12-02 2019-12-02 Central area determining method, foreign matter removing device and detecting equipment

Country Status (1)

Country Link
CN (1) CN112991253B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004145760A (en) * 2002-10-25 2004-05-20 Canon Inc Image processing method
CN102096917A (en) * 2010-12-22 2011-06-15 南方医科大学 Automatic eliminating method for redundant image data of capsule endoscope
JP2011163804A (en) * 2010-02-05 2011-08-25 Seiko Epson Corp Foreign matter detection device and method
WO2017088462A1 (en) * 2015-11-24 2017-06-01 乐视控股(北京)有限公司 Image processing method and device
CN107221005A (en) * 2017-05-04 2017-09-29 美的集团股份有限公司 Object detecting method and device
CN107330465A (en) * 2017-06-30 2017-11-07 清华大学深圳研究生院 A kind of images steganalysis method and device
CN108230321A (en) * 2018-01-19 2018-06-29 深圳市亿图视觉自动化技术有限公司 Defect inspection method and device
CN108982511A (en) * 2018-06-27 2018-12-11 天津大学 Rolling element rejected product elimination method
CN109598723A (en) * 2018-12-11 2019-04-09 讯飞智元信息科技有限公司 A kind of picture noise detection method and device
CN110070523A (en) * 2019-04-02 2019-07-30 广州大学 A kind of foreign matter detecting method for bottom of bottle
CN110111352A (en) * 2019-03-18 2019-08-09 北京理工雷科电子信息技术有限公司 One kind detecting false-alarm elimination method based on feature cascade SAR remote sensing images waters
CN110415237A (en) * 2019-07-31 2019-11-05 Oppo广东移动通信有限公司 Skin blemishes detection method, detection device, terminal device and readable storage medium storing program for executing


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Image inpainting algorithm based on directional interpolation with spatial continuity"; Zhou Chunxia, Wu Xisheng; Computer Engineering and Design; Vol. 30, No. 4; full text *

Also Published As

Publication number Publication date
CN112991253A (en) 2021-06-18

Similar Documents

Publication Publication Date Title
US11922615B2 (en) Information processing device, information processing method, and storage medium
CN110175982B (en) Defect detection method based on target detection
CN110929756B (en) Steel size and quantity identification method based on deep learning, intelligent equipment and storage medium
CN108629775A (en) A kind of hot high-speed rod surface image processing method
CN108181316B (en) Bamboo strip defect detection method based on machine vision
CN113109368A (en) Glass crack detection method, device, equipment and medium
AU2019203344B2 (en) Character recognition method
CN112861654A (en) Famous tea picking point position information acquisition method based on machine vision
CN111768407A (en) Defect detection algorithm based on quick positioning
CN114040116B (en) Plastic mould good product monitoring feedback system
CN115170512A (en) Defect classification and identification method and device, storage medium and electronic equipment
CN113569859B (en) Image processing method and device, electronic equipment and storage medium
CN114627116A (en) Fabric defect identification method and system based on artificial intelligence
CN112991253B (en) Central area determining method, foreign matter removing device and detecting equipment
CN114049556A (en) Garbage classification method integrating SVM (support vector machine) and target detection algorithm
CN113095445A (en) Target identification method and device
CN117314880A (en) Image defect detection method and device
CN110694940A (en) Control method and system for adjusting blowing of spray valve in real time based on dead pixel and size
JPH1125222A (en) Method and device for segmenting character
CN114550069A (en) Piglet nipple counting method based on deep learning
CN109118503B (en) Method for quickly detecting specific target of high-resolution remote sensing image
RU2580074C1 (en) Method for automatic segmentation of half-tone complex-structured raster images
JP4238074B2 (en) Surface wrinkle inspection method
CN113516611A (en) Method and device for determining abnormal material removing area, and material sorting method and equipment
CN111222510A (en) Trolley grate bar image shooting method and system of sintering machine

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant